Lucid Labs’s CTO, Jerome Guillen, on the future of robotics

In 2009, after only a year as Tesla’s senior vice president of engineering, Jerome Guillen helped engineer the Model S’s first public unveiling in Los Angeles. Seven years later, he arrived at Lucid Labs to lead its burgeoning robotics research department. In the three years since, the Palo Alto, Calif.-based company has designed its first prototype, conducted its first hardware tests, and used robots to build its next-generation Learner’s Car. “I am fascinated by the acceleration of innovation in electronics,” says Mr. Guillen, who is also Lucid’s chief technology officer. “We are iterating incredibly fast.” The company has raised $746 million in private equity to date, according to the Bloomberg Billionaires Index. Here, Mr. Guillen talks about Lucid’s robotics department, which is currently recruiting at Pier 48 in San Francisco.

Tell us about Lucid’s robotics department.

We are a robotics company, but that is not our only focus. [We’re] building computers; cars; machine learning and artificial intelligence; and software that allows our customers to save money through robotics.

What were your goals as Lucid’s first robotics research chair?

We started off as a robotics company, and we needed a way to have humans inside our test pods. We needed to test how robots would work and how humans would function within robot-controlled environments. So I spent three years working with Red Tops, a company that engineers and designs robots, to build a system that simulates what we want to do in the future. We use Red Tops technology inside our robot, which is built on open systems so that people can build on top of it.

What are the main areas of robotics research at Lucid Labs?

I see our robots having an impact in a lot of different arenas. One area is our systems engineers, who are using our motor control technology to help autonomous vehicles get in and out of places. We are also working on more sensitive wireless sensors that are being used to map the world, [creating] simulated cities and [measuring] things like pollutants and radiation. Then, of course, we are working on humanoid robots that are intended to use cognitive reasoning in highly controlled environments. We are also working on robotics to help our customers work smarter so they can drive more efficiently, as in self-driving cars, or using it in the automotive aftermarket to improve seat posture, which makes driving more comfortable and also helps prevent back problems. And finally, we are exploring ways to use robots to help people with physical and cognitive disabilities. We have had success in testing robotic assistants for patients. We have been approached by insurance companies and nursing homes that have said they’d like to have robots in their facilities. It would let them cut the number of people they have to hire and help them better care for patients.

What changes have you seen in the way that Lucid technologies are used, compared to five years ago?

Automation, in terms of everything from video to imagery, was [thought] necessary, but even that was just opening up what was possible with remote-sensing technology. No one thought that kind of evolution would happen. Now, everyone can see the things that autonomous vehicles will achieve. I don’t know if they can get to something that is just as innovative as Tesla, but that’s possible.

The Daily Beast | Lucid Labs