Planning and execution are two very different beasts. A project may appear straightforward on paper, staying within budget, on schedule, and technically sound, only to hit roadblocks in the real world. Turning ideas into reality isn’t always smooth, and success depends on how well we anticipate and navigate the unknowns. This gap between concept and execution is especially pronounced in the fast-growing realm of edge artificial intelligence (AI). In a recent roundtable discussion...
Everyone, it seems, is now talking about how they’re planning to integrate AI into their devices, their factories, their workflows and, well, everything. But how they actually plan to make that happen isn’t always clear. Part of the challenge, of course, is that different workloads and different environments require different types of solutions. For those looking to integrate AI-powered capabilities into edge computing-based offerings, there is a relatively broad range of ways to ac...
Human-machine interfaces (HMIs) are rapidly evolving, driven by trends such as Automotive personalization, sustainable always-on interfaces, hygienic touchless user interfaces (UI), consistent user experience (UX) across platforms, voice activation, and Industrial automation for labor and safety needs. Regardless of their specific drivers and use cases, modern HMIs must be smarter and more dynamic – shifting from command-based to context-aware systems that bridge the human-machine...
Edge AI is reshaping how machines interact with the world, bringing intelligence closer to the data source and enabling real-time, context-aware decision-making. In Automotive and Industrial environments, this shift is driving smarter sensors, automation, and enhanced Human-Machine Interfaces (HMIs). But deploying AI at the edge comes with constraints: limited computing capacity, strict power budgets, and compact hardware footprints. In our latest LinkedIn Live panel discussion, Lattice expert...
In the past, Human Machine Interfaces (HMIs) were relatively simple, consisting of buttons, knobs, levers, and static displays designed to control basic functions. Today, HMIs have evolved into sophisticated, context-aware systems that serve as the primary bridge between users and increasingly intelligent machines. Whether embedded in vehicles, Industrial equipment, consumer electronics, or smart infrastructure, modern HMIs must manage a growing array of tasks. These include real-time data vi...
Edge AI — implementing AI models on Edge devices to process algorithms locally rather than in a centralized computing location, like the cloud — has garnered significant attention as one of the fastest developing areas of artificial intelligence. Valued at roughly $21 billion in 2024, the Edge AI market is expected to exceed $143 billion by 2034. This signals a sustained focus across industries on the development of AI-powered Edge systems. New opportunities for Edge AI ...
I took my 15-year-old to her first big concert recently - Roger Waters’ Us + Them tour. If you are wondering where I am going with Roger Waters in a blog on automotive technologies, stay with me for a second. She and I both loved the concert; however, as I drove to and from the venue, I struggled with the stress of congested traffic and getting in and out of the parking lot. During those high-stress moments, I imagined how wonderful it would be to have an autonomous car that would pick me up, drop me off, and navigate through the mess while I relaxed.
Change is constant for embedded video system designers. Over the last few years, designers have seen systems that incorporate mobile application processors with an entirely new level of capability, the adoption of new standard interfaces originally developed for the mobile consumer market, and the introduction of a new generation of lower-cost image sensors and displays.