Inside the Silicon Valley Robotics Scene: An Interview with Ted Larson of OLogic

August 31, 2021

Robots have captivated us for decades. Sci-fi films and novels have speculated on how human and AI interactions might play out in a range of future scenarios. Some depictions have been positive, while others have been…not so positive. Meanwhile, robotics remains one of the most exciting industries for technologists and designers alike. Over the years, Whipsaw has been privileged to design robots featuring some of the most cutting-edge technology in Silicon Valley, from home assistants to dog companions.

One of the more recent developments in this sector has been the merger of artificial intelligence (AI) and the Internet of Things (IoT). It’s this merger that lets devices customize their user experience in breakthrough new ways. What’s more, these technologies are stealthily integrated into innovations you might not expect, like the Tonal Strength Training System.

I recently interviewed OLogic Inc. CEO Ted Larson for his perspective on the evolution of the Silicon Valley robotics scene, the relationship between AI and IoT, and where it’s all headed.

On the beginnings of his career in the robotics sector of Silicon Valley

Larson: So, if we really rewind the clock, I officially embarked on this journey while getting my master’s in Computer Science at California Polytechnic State University. From there, I interned at Los Alamos National Laboratory in New Mexico and then went on to spend five years at Hewlett Packard.

It was an exciting time to be in the Bay Area. I began consulting on the side for early tech startups and finally launched my own startup in ’96. Things went smoothly until the venture capital market completely fell apart after the tragic attacks of September 11, 2001.

My company was struggling, so to lift my spirits I did a lot of tinkering in my spare time. I ended up joining the HomeBrew Robotics Club in Silicon Valley. Those guys were awesome. They were making homemade robots in their garages and bringing them into a club setting to show us how they made them.

I was, and am, most interested in robots that solve really mundane tasks, and perform them well.

Bob Allen, another robot club member, and I started building really interesting robots together. Then in 2005, the idea arose that we should pitch ourselves as robot consultants. We thought, “If this works out, we could actually get paid to develop robots!” Things accelerated for us when the VP of Engineering at Hasbro, the second-largest toymaker in the world, became a fan of our work. That’s how we began developing products for them through the years. That early work was really a crash course in how to design the insides of products and how to scale them, lessons that carried through to later projects like the Savioke and Cobalt robots.

In 2011, I ended up hiring a few of my friends, including several from the robot club, and we took on all sorts of projects in the consumer electronics space. Willow Garage, a robotics research lab and technology incubator, had just invented ROS and built the PR2 robot. Many of our friends had gone to work there, so we were able to get in on the ground floor of the robotics scene. We got the Savioke robot that way as well, because when Willow ended in 2014, its CEO went on to start Savioke. These technologies all responded to different needs in very different market segments, like delivery robots for the hospitality and medical sectors.

On making innovative robots

Larson: I was, and am, most interested in robots that solve really mundane tasks, and perform them well. Like the Servi food service robot. I’m excited about that because it’s just really cool, and Bear Robotics is making a lot of headway in the restaurant industry with this innovation. Dusty Robotics is also amazing. Its robot basically drives around the inside of a building under construction and uses inkjets to print the floor plan on the floor for the crew in just a few days. It would take a crew weeks to do that on their own.

Servi Food Service Robot

AI and IoT, in layman’s terms

Larson: So, if you take any device that can be powered on or off and connect it to the Internet, where it then communicates with other connected devices, that’s the Internet of Things. When you combine that with artificial intelligence that processes and learns from the data those devices generate, every device within this massive network starts to evolve. The user of the device then enjoys an experience tailored to their specific needs and goals. This union is poised to assist the evolution of countless industries, from automotive to medical to sports.

Another term I should explain is what we call “Edge IoT.” Edge computing refers to data being processed and stored at the system’s “edge,” meaning near the devices that create the data in the first place. Most Edge IoT applications involve computer vision, where transferring enormous volumes of data to the Cloud for processing is impractical. The current trend in this space is to crunch all the heavy data on the IoT device itself and then send only the important results to the Cloud. That makes for lower latency, and you’re also far less exposed to attacks and bandwidth constraints.
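To make that pattern concrete, here’s a minimal sketch of the Edge IoT data flow Larson describes: heavy processing stays on the device, and only a compact result travels to the Cloud. The endpoint URL, the stubbed-out detector, and the function names are hypothetical, not drawn from any specific product.

```python
# Minimal sketch of the Edge IoT pattern: process heavy data on the device,
# send only the small result to the cloud. All names here are placeholders.

import json
import time
import urllib.request

CLOUD_ENDPOINT = "https://example.com/api/events"  # hypothetical endpoint


def run_local_inference(frame_bytes: bytes) -> dict:
    """Stand-in for an on-device vision model (e.g. person detection).

    A real system would call an embedded inference runtime here; this stub
    just returns a small, fake result to illustrate the data flow.
    """
    return {"label": "person", "confidence": 0.93, "timestamp": time.time()}


def send_summary_to_cloud(result: dict) -> None:
    """Ship only the few bytes of the result, never the raw camera frame."""
    payload = json.dumps(result).encode("utf-8")
    request = urllib.request.Request(
        CLOUD_ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    # urllib.request.urlopen(request)  # left disabled: the endpoint is a placeholder


if __name__ == "__main__":
    raw_frame = b"\x00" * (1920 * 1080 * 3)  # ~6 MB of pixel data stays on the device
    result = run_local_inference(raw_frame)  # the heavy lifting happens at the edge
    send_summary_to_cloud(result)            # only ~100 bytes would leave the device
    print("summary ready to send:", result)
```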

Edge computing really makes an impact in areas like home health monitoring, home robotics, and connected fitness… Essentially, anywhere your data needs to be ultra-secure and processing needs to happen in real time at the point of origin.

Connected fitness opened up an entirely new market that allowed innovations like Tonal to emerge, and these behind-the-scenes technologies enabled such innovations to thrive.

How the marriage between AI and IoT shows up in innovations like the Whipsaw-designed Tonal

Larson: Connected fitness opened up an entirely new market that allowed innovations like Tonal to emerge, and these behind-the-scenes technologies enabled such innovations to thrive.

Tonal’s built-in AI, for example, pulls from its user’s history to assess their strength level, and it sets its weights accordingly. Edge IoT also enables Tonal’s sensors to monitor your movements and energy levels, and the system makes adjustments to help you reach your desired fitness goals. This is a remarkable upgrade from traditional fitness equipment.
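As a rough illustration of that kind of adaptive logic, here’s a toy sketch that nudges the next resistance setting based on recent workout history. This is not Tonal’s actual algorithm; the data shape, thresholds, and numbers are invented for illustration.

```python
# Toy example of history-driven weight adjustment, in the spirit of the
# adaptive behavior described above. Everything here is made up.

from dataclasses import dataclass


@dataclass
class SetRecord:
    weight_lbs: float    # resistance used for the set
    target_reps: int     # reps the program asked for
    completed_reps: int  # reps the user actually finished


def suggest_next_weight(history: list[SetRecord]) -> float:
    """Nudge the resistance up or down based on how the last few sets went."""
    recent = history[-3:]
    last_weight = recent[-1].weight_lbs
    completion = sum(r.completed_reps for r in recent) / sum(r.target_reps for r in recent)

    if completion >= 1.0:   # finished every rep: add a little weight
        return round(last_weight * 1.05, 1)
    if completion < 0.8:    # struggled badly: back off
        return round(last_weight * 0.90, 1)
    return last_weight      # otherwise hold steady


history = [
    SetRecord(50.0, 10, 10),
    SetRecord(50.0, 10, 10),
    SetRecord(50.0, 10, 10),
]
print(suggest_next_weight(history))  # 52.5 in this made-up example
```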

Tonal

Larson: Another area where that marriage is having a big impact is eLearning. In this case, children might be doing their homework in front of a camera, for example, and the camera will help to grade their homework as they go.

Smart cameras and sensors that help with data transmission are frankly helping industries across the board. In the restaurant industry, they’re used to quickly perform quality control of food. In the retail industry, they help managers determine when customers will approach the checkout line based on their movements. And in the automotive industry, they help self-driving cars predict pedestrian behavior and maneuver accordingly.

This type of system is also often found whenever you see robots in manufacturing. In the waste management sector, for example, companies can use smart ‘robot arms’ to sort recyclables from trash.

A lot of Edge IoT is based around the idea of doing smart things with regular cameras or allowing 3D cameras to help make decisions out on the Edge. The cool thing is that the device’s AI keeps learning as its algorithms process more data. It’s a constantly evolving process.

Robotics is creating opportunities for people with disabilities

Larson: It’s wonderful how the robotics industry has opened opportunities for the disabled community. For example, people with disabilities can use virtual reality-controlled robots to perform tasks they wouldn’t be able to otherwise, such as stocking shelves in a store or packing boxes in a warehouse. In the future, we’ll see broader applications of this technology, and people previously excluded will be able to contribute to society in all sorts of new ways.

I don’t think personal interactions will ever be fully automated out.

The role of AI, automation, and job security

Larson: Large corporations are currently making a huge effort to train AI. The Facebooks and Googles out there have enormous teams that train AI how to recognize things. Team members will draw circles around stop signs, for example, so that AI will then be able to recognize stop signs in photographs. You know when you’re logging in to a website and trying to prove you’re human by identifying the bicycles in a set of images? Well, eventually, AI will be able to bypass those blockers. These systems will know what bicycles look like on every operating system and in all conditions because of the millions of people currently training them.

Larson: Look, I’m old enough to remember when ATMs first appeared on the scene. It was suddenly super convenient to get money out of your bank. Initially, bank tellers didn’t lose their jobs. There was a period of adjustment. Eventually, however, banks like Wells Fargo turned entire branches into ATMs. I still don’t know if that was a good move, because I actually like having personal connections with other human beings. People like me still sought out branches where they could have personal contact. I do know that the experiment some banks tried, closing all their branches and turning themselves into ATMs, didn’t totally work out for them. I don’t think personal interactions will ever be fully automated out.

In some cases, robots are already replacing the jobs of unskilled workers. On the bright side, many of those workers are also being trained to care for those robots, so they’re often, ideally, being paid more for a different job.

Over time, people will simply have to deal with all the automation in their lives. I think about teaching my 90-year-old father how to summon an Uber with a smartphone, for example. That type of automation has opened up a new world for him. He honestly has a new life because Uber gave him back his mobility. In most cases, automation is just a matter of convenience.

To create a hit, the product first needs to look like it belongs in the environment it’s intended to be used in.

The role of design

Larson: Design is critical for a new product. The sizzle sells the steak. You want an alluring end-product that you can show your customers, and which they can visualize working in its environment. To create a hit, the product first needs to look like it belongs in the environment it’s intended to be used in. I’ve seen a lot of products fail because they did their job well, but they looked horrible while doing it.

Design is the product’s first impression. The form way outweighs the function. I can’t afford a Lamborghini, but I’ll look at pictures of one because the design is so phenomenal. When developing a robot, you have a choice: You can design it yourself (and risk people judging it because it doesn’t look good), or you can go through the industrial design process and create a product that’s more impressive to watch while doing its thing.

The future of robotics

Larson: I’ve developed robots throughout my career, and I really believe the union of AI and IoT represents the future of the robotics industry. I also think things are moving out of the Cloud and onto the Edge. Edge computing is the new frontier. It’s also an evolving field, so we don’t really know where it will end up. I’ll personally be working on Edge-related projects going forward.

*The global edge computing market is expected to reach $18.7 billion by 2027.
