Ahead of 'Paradigm Shift', MemryX Seeks Series B Raise To Develop AI Accelerator Tech

Zinger Key Points
  • MemryX is developing a semiconductor device designed to efficiently run AI.
  • "We've started sampling our hardware and we have it out to two dozen companies," Peene says.

The late Stephen Hawking once said: “Success in creating effective [artificial intelligence] could be the biggest event in the history of our civilization. Or the worst. We just don’t know. So, we cannot know if we will be infinitely helped by AI, or ignored by it and side-lined, or conceivably destroyed by it.”

There’s truth to that, according to Roger Peene, the vice president of product and business development at Michigan-based MemryX.

“AI will be pervasive in everybody's lives,” he told Benzinga. “AI is a complete paradigm shift: it is based upon statistical analysis and models that you train but that, at times, give unexpected outputs.”

Peene has been in the semiconductor industry for 30 years. After earning an electrical engineering degree, he spent 16 years at Intel Corporation INTC, before heading to Micron Technology Inc MU. He eventually left the corporate world to join what he calls a "true startup" — MemryX.

Read on to hear more of Peene's thoughts on the AI revolution, in-memory computing, the work MemryX is doing and how it will impact the world:

The following text was edited for clarity and concision.

BZ: What inspired the transition from a large corporation to a startup?

Peene: One, as I progressed in my career, the ability to create something from nothing became very appealing to me. Second, it's the field of AI, and I love being at the forefront of technology. For example, when SSDs came to the market, I jumped into that field early on. Now, just about every single computer and data center deploys SSDs. It's mainstream. I see a similar evolution for AI-on-the-edge devices. AI will become pervasive, and there is a synergy between what I did before in memory and storage, and AI.

Are larger institutions working on these ideas, as well?

Some of the larger companies like NVIDIA Corporation NVDA leveraged their GPU technology in the AI space early on. It used to be that innovation came from big companies. One of the key themes that I have seen over the last couple of decades is tremendous consolidation, as well as paring back on research and development (R&D) — OpEx, not CapEx — to focus on core technologies, and on what they’re delivering to the market.

You see a lot more R&D being done by startups. And then they get acquired by the larger companies, versus organically developing and growing within the walls of larger companies. For example, Intel acquired Movidius, an AI company. Rather than develop the technology from the ground up, they wait until the startups get momentum, and then they acquire.

What are you focused on in your role?

I am the VP of product and business development. Four of us are currently running the company: myself; Keith Kressin, formerly a senior VP at Qualcomm; Dr. Wei Lu, a neuromorphic computing expert out of the University of Michigan; and Puneeth Singh, the VP of engineering.

Our company is post-Series A and developing a semiconductor device, an accelerator, that is designed to efficiently run AI neural networks at the edge. We're talking about surveillance cameras, drones, and automotive and metaverse applications. My specific task is to define our future roadmap, create all of our marketing messaging, and, then, ultimately secure design wins with top customers.

What are the use cases with this technology?

In the area of transportation, there are a lot of different use cases forming with cars and fleet vehicles. One would be around vehicle monitoring. All the cameras and AI models would detect potential conflicts around the vehicle — a car, bicycle, dog, or human — and alert the driver.

Another is in-cabin monitoring. So, especially for commercial fleet vehicles, there's a camera on the driver to see if the driver is distracted and not paying attention. The monitor would signal an alert that says: “Hey, you know, time to drink a cup of coffee, wake up, or something.” There are also driver-assist applications.

Surveillance cameras, too: anybody deploying surveillance cameras and looking for intelligence from that feed. When you put AI into the camera, it can send back information that alerts or initiates a task based on what's going on in the camera feed. For example, with a camera, an AI can set up invisible boundary lines. Then, let's say a forklift drifts outside that boundary line. The AI will immediately send an alert.
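
To make the boundary-line example concrete, here is a minimal sketch of how such an alert could be wired up in software. It is illustrative only: the Detection class, zone coordinates, and alert wording are hypothetical, and the object detector that would actually produce the detections (for example, a model running on an edge accelerator) is not shown.

```python
# Minimal sketch of the "invisible boundary line" idea described above.
# Everything here (the Detection class, the zone coordinates, the alert text)
# is a hypothetical placeholder; a real deployment would feed detections from
# an object-detection model running on or near the camera, which is not shown.

from dataclasses import dataclass


@dataclass
class Detection:
    label: str   # e.g. "forklift", "person"
    x: float     # bounding-box center, in pixels
    y: float


# Allowed operating zone, defined as an axis-aligned rectangle in image space.
ZONE = {"x_min": 100, "x_max": 1180, "y_min": 200, "y_max": 700}


def inside_zone(det: Detection, zone: dict = ZONE) -> bool:
    """True if the detection's center lies inside the allowed zone."""
    return (zone["x_min"] <= det.x <= zone["x_max"]
            and zone["y_min"] <= det.y <= zone["y_max"])


def check_frame(detections: list) -> list:
    """Return alert messages for any forklift that drifts out of bounds."""
    return [
        f"ALERT: {d.label} left the designated zone at ({d.x:.0f}, {d.y:.0f})"
        for d in detections
        if d.label == "forklift" and not inside_zone(d)
    ]


if __name__ == "__main__":
    frame = [Detection("forklift", 1250.0, 640.0), Detection("person", 400.0, 500.0)]
    for alert in check_frame(frame):
        print(alert)
```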

Is this computer vision and AI technology easy to use? Are there important considerations?

If you're on loading docks where the lighting changes, or if the cameras are outside where it's raining, dark, or snowing, that may cause issues. However, AI models will often filter out that noise and see what's truly going on in the environment. You can take it one step further and go to Industry 4.0 and robotics. Obviously, AI with robotics and manufacturing is really important. On a manufacturing line, there's a lot of robotics, and in those robots there are cameras. If you put AI in those cameras for defect density or defect analysis, you achieve significantly more precision than the human eye.
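
As a toy illustration of the defect-analysis idea, and not a description of how MemryX or any production system actually works, the sketch below scores an inspected camera patch against a hypothetical golden reference; on a real line, a trained vision model would replace this mean-pixel-difference stand-in.

```python
# Toy illustration of camera-based defect analysis on a production line.
# A real system would run a trained vision model on the camera feed; here a
# hypothetical "golden reference" comparison stands in so the sketch stays
# self-contained and runnable.

# Reference grayscale patch of a known-good part (pixel values 0-255).
GOLDEN = [
    [120, 121, 119],
    [122, 120, 121],
    [119, 121, 120],
]


def defect_score(frame, reference=GOLDEN):
    """Mean absolute pixel difference between the inspected patch and the reference."""
    diffs = [
        abs(p - r)
        for frame_row, ref_row in zip(frame, reference)
        for p, r in zip(frame_row, ref_row)
    ]
    return sum(diffs) / len(diffs)


def inspect(frame, threshold=10.0):
    """Flag the patch as defective when it deviates too far from the reference."""
    score = defect_score(frame)
    verdict = "DEFECT" if score > threshold else "OK"
    return f"{verdict} (score={score:.1f})"


if __name__ == "__main__":
    good_part = [[121, 120, 119], [122, 121, 120], [118, 121, 121]]
    scratched = [[121, 20, 119], [122, 15, 120], [118, 22, 121]]
    print(inspect(good_part))   # expected: OK
    print(inspect(scratched))   # expected: DEFECT
```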

Is there value to Elon Musk’s remarks on cameras being far superior to LiDAR?

Correct. Musk is not a fan of LiDAR. He basically says the best vision system is the human eye. How can we replicate the human eye with technology? Cameras.

Do Musk’s concerns with respect to AI have any merit?

We could get into a political or ethical debate. Like any technology, it can further mankind. It could be used to improve people's daily lives, and it could be used in many positive ways, similar to the way a smartphone benefits us. However, it could also be used in nefarious ways that repress societies. With AI, moving forward, there will probably have to be regulation.

Have you partnered with any big names?

We've started sampling our hardware to two dozen companies, who are currently evaluating it. We expect that to grow over the coming months.

How do you compete with more established names?

The job of marketing is to accelerate awareness and deployment of any technology in the marketplace. As our awareness goes out there, through articles and interviews with our CEO, for instance, we are actually getting proactive inquiries from large companies, including major defense contractors and OEMs, as well as large surveillance companies around the world.

Further, there are a few key value vectors we offer. Really good performance, low latency, and a good cost structure are, to me, table stakes. Some of our competitive advantages include simplicity and ease of use to accelerate deployments, as well as scalability.

Because we’re nimble, a lot of our customers are able to get up and running in hours with our technology, whereas with the other folks, it could take weeks or months to get up and running.

How do you, as a startup, keep costs down?

We're a 37-person startup and post-Series A. To date, we've raised about $18 million, and we've done a tremendous amount on a shoestring budget. You can do a lot with the right people and skill sets. Our key architects came from the University of Michigan. We have several PhDs who'll roll up their sleeves, and that's what keeps costs down.

Where does a lot of this production happen?

Much of the semiconductor manufacturing is done by Taiwan Semiconductor Mfg. Co. Ltd TSM in Taiwan. Others include GlobalFoundries Inc GFS. TSMC is the behemoth in the industry.

How do these globalization pressures impact your business?

If something should happen in Taiwan, it would have a significant impact on the semiconductor industry in general. We're small potatoes on a global scale, though, and from a volume perspective we could move to, say, GlobalFoundries or someone else fairly simply. Look at somebody like NVIDIA, which is also a fabless company with tens of billions of dollars in revenue and significant volumes. They can't move.

How long does it take to make chips?

It depends on the complexity of the chip and how much pre-work you did. So we'll break it down. There's the design phase of one of these chips. Depending on the complexities of the design, it can take anywhere from six months to five years. MemryX was founded in 2019. For us, it took about three years, because it's a brand-new architecture. I think nine to 12 months is pretty reasonable.

How will MemryX scale?

We're doing a Series B raise to keep the business going until we get into revenue. That's our focus. Once we have our chip and we're able to sell it, we can turn on the volume. We've gotten feedback from customers and aim to go into production quickly.

Anything to mention before we conclude?

AI will be pervasive in everybody's lives. To add, AI is not deterministic. It is a paradigm shift, based upon statistical analysis and models that you train but that, at times, give unexpected outputs. The whole structure of AI is intended to mimic how the human brain operates, and it's getting better every single year. The things it can do now were unheard of in traditional computing.

Photo by Tara Winstead from Pexels.
