
New Qualcomm System on Modules at Embedded World: Power and Performance of Qualcomm, Now for Broad Market Applications

Add Critical Link booth #3065 to your Embedded World must-see list and check out a live demonstration of AI at the edge using our latest family of Qualcomm-based system on modules!

The MitySOM-QC6490/QC5430 family is based on Qualcomm® Dragonwing™ QCS6490 and QCS5430 processors. These system on modules deliver a combination of multi-core CPU, GPU, and VPU processing, a host of memory options, an on-board power supply, and a range of high-speed interfaces, all in a 45mm x 45mm package. Developers also gain access to the Qualcomm AI Hub, a toolkit offering examples for speech recognition, image classification, text generation, and many other common AI applications.

The new MitySOM-QC6490 system on module will be running a series of live AI vision-based demos in Critical Link’s booth at Embedded World. The demos leverage open source and proprietary models performing single-camera depth estimation, facial recognition, pose estimation, and more. Critical Link will be meeting with interested companies about real-world applications of these capabilities and about how the MitySOM-QC6490 can accelerate development to achieve first-to-market status.

All of Critical Link’s MitySOM solutions are designed for high-performance industrial applications. Customers can count on long-term availability, quality, and design maintenance. Our engineering and production teams are based in the US, and we provide a level of support that is unmatched in the industry. Learn more about what sets Critical Link apart: https://www.criticallink.com/the-critical-difference/.

Embedded World North America is November 4th to 6th in Anaheim, CA. Register today using Critical Link’s promo code CRITIC25, and be sure to visit us at booth #3065. While there, scan your badge for a chance to win a MitySOM Development Kit of your choice.

Not going to Embedded World? Visit www.criticallink.com to explore our available products and engineering services, or reach out to us directly anytime at info@criticallink.com to discuss how we can support your next project.

New Altera Agilex 5 FPGA Solutions at Embedded World: Power/Performance Optimized Solutions for Industrial AI

Visit Critical Link booth #3065 at Embedded World North America this November to learn about our latest family of FPGA system on modules featuring Altera Agilex™ 5!

The MitySOM-A5E family of products includes three options, each with an array of features to best fit new customer designs. The MitySOM-A5E offers up to 656K LE of FPGA fabric, dual-core Arm Cortex-A55 and dual-core Cortex-A76 processors, 24 transceivers, dual banks of RAM, and a wide range of user-friendly interfaces. Designs with size or budget constraints can take advantage of the MitySOM-A5E Mini, which features a smaller footprint and lower-density devices for a cost-optimized solution. Critical Link also offers an Agilex 5 single board computer that provides ultimate flexibility and speed.

Critical Link’s Agilex 5 FPGA family of products is targeted at broad market adoption, particularly applications needing high performance, low power, and small form factors at the edge. This includes industrial equipment, test and measurement, communications, video and broadcast, medical devices, and other embedded instrumentation.

Critical Link’s long history of designing industrial-class embedded solutions means customers can count on long-term availability, quality, and design maintenance. Our engineering and production teams are based in the US, and we provide a level of support that is unmatched in the industry. Learn more about what sets Critical Link apart: https://www.criticallink.com/the-critical-difference/.

Embedded World North America is November 4th to 6th in Anaheim, CA. Register today using Critical Link’s promo code CRITIC25, and be sure to visit us at booth #3065. While there, scan your badge for a chance to win a MitySOM Development Kit of your choice.

Not going to Embedded World? Visit www.criticallink.com to explore our available products and engineering services, or reach out to us directly anytime at info@criticallink.com to discuss how we can support your next project.

Join Critical Link at Arrow University St. Louis

We’re excited to get to St. Louis this week for another Arrow Electronics event! We’ll be talking about Asymmetric Multiprocessing and AI at the Edge, plus showing our latest MitySOM system on modules. It’s not too late to register – sign up today at https://forms.office.com/r/FpgyLsh0Lb

Can’t make the event? We’re always available to talk with you or your team, and can share our AMP and AI presentation virtually or in person. Just contact us at info@criticallink.com.

Talking AMP and AI at Arrow University in Pittsburgh & Cleveland

Thank you to the Arrow Electronics team in Pittsburgh and Cleveland for having Critical Link at their annual Arrow University events this week! The turnout in these regions is always top notch, and we love talking with local engineers about Leveraging Asymmetric Multiprocessing and AI at the Edge for Scalable Embedded Systems.

Missed the event? We’re always available to talk with you or your team, and can share our AMP and AI presentation virtually or in person. Just contact us at info@criticallink.com.

Discover the Future of FPGA Innovation with Altera & Critical Link

Join us on February 11th, 2025, at 9 AM PT for an exclusive LinkedIn Live event that will redefine the possibilities of FPGA technology. Hear from industry leaders and innovators as they share insights, strategies, and success stories driving the future of tech.

Gain insights into the latest FPGA advancements, Altera’s strategies, and how we are shaping the FPGA landscape with our customers and partners.

– Accelerating Innovation with Altera: Hear from Altera CEO Sandra Rivera as she shares Altera’s vision and how investments in cutting-edge silicon and software will drive sustainable, long-term success as an independent company.
– Spotlight / Success Stories: Discover how industry leaders like Tejas Networks, DigiKey, and Critical Link leverage Altera’s solutions to innovate and achieve breakthrough results, demonstrating Altera’s commitment to driving customer innovation through focus, quality, and execution.
– Altera Partner Program: Learn more about Altera’s new partner program and get to market faster with end-to-end support and resources from Altera’s broad partner ecosystem.

Speakers:
– Sandra Rivera, Chief Executive Officer of Altera
– Dave Doherty, President of DigiKey
– Arnob Roy, Co-founder, Executive Director, COO of Tejas Networks
– Tom Catalino, Co-founder, Vice President of Critical Link


REGISTER TODAY!

AI-driven robots that can help folks live more independently? Bring it!

A couple of weeks ago, I posted about attempts to help alleviate one of the downsides of AI: its power consumption, and the environmental impacts that come with it.

What I didn’t mention in that post is that there are small ways in which we can help reduce AI-related power consumption, and that’s by not buying into the “AI-for-everything” madness, in which AI is used even when it’s not needed and/or adds limited value.

As consumers, do our home appliances have to be all that smart? Do we really need AI to tell us what’s in the fridge when we can just open the door and look? Do we need to deploy AI for every Google search we do? Shouldn’t we be willing to “wait” an extra second for a non-AI search that may, by the way, yield superior results? (Public service announcement: if you don’t want Google to produce an AI Overview for every search you do, use the Web option when you search.)

But there are plenty of applications where AI can and should be used in the home. And one was chronicled last spring in MIT Technology Review.

More than twenty years ago, Henry Evans – then only 40 years old – suffered a major stroke and ended up a quadriplegic who was unable to speak. Over the years, he was able to use his eyes and a letter board to communicate, but in most day-to-day situations, Henry has to rely on caregivers.

Then robotics came on to the scene. Sort of.

In 2010, Henry saw a demo of a primitive “metal butler,” and asked himself why something like that wouldn’t work for him.

There was a solid reason why not. While engineers have made great progress in getting robots to work in tightly controlled environments like labs and factories, the home has proved difficult to design for. Out in the real, messy world, furniture and floor plans differ wildly; children and pets can jump in a robot’s way; and clothes that need folding come in different shapes, colors, and sizes. Managing such unpredictable settings and varied conditions has been beyond the capabilities of even the most advanced robot prototypes.

But thanks to AI, that may be about to change, giving robots the opportunity to advance beyond the skills that are driven by purpose-built software and to acquire new skills and figure out new environments faster than they ever could before.

Henry Evans has already been working with experimental robots that are letting him take care of tasks like brushing his hair. “Stretch,” the robot Henry is currently working with – the brainchild of Georgia Tech professor Charlie Kemp – goes beyond specific-purpose tasks, like hair-brushing, and lets users “plug in their own AI models and use them to do experiments.” As Stretch learns more, it can do more.

With AI-powered software, robots will be able to acquire new skills automatically rather than having each problem solved independently, with every element plotted in “excruciating detail.” Not that these painstakingly acquired skills aren’t impressive. Who hasn’t marveled at a video of humanoid (or dog-oid) robots climbing stairs, boogeying, opening doors? But researchers are now moving beyond pure physical dexterity, experimenting with building “general purpose robot brains” in the form of neural networks, and tapping generative AI in ways that go beyond “the realm of text, images, and videos and into the domain of robot movements.” This will allow robots to adapt to new environments and learn new tasks far more quickly.

The cost (and size) of robots will come down, their utility will go up, and for folks like Henry Evans, the world will open up. It already is. While Stretch is imperfect – it’s buggy and bulky – it’s a declaration of independence. “All I do is lay in bed, and now I can do things for myself that involve manipulating my physical environment.” These include playing with his granddaughter, holding his own hand of cards, and eating a fruit kabob.

I’ve always maintained that one of the very best things about being an engineer is doing work that really does improve people’s lives.

AI-driven robots that can help folks live more independently? Bring it!


_________________________________________________________

Source of Image (Henry Evans giving his wife Jane a rose): IEEE Spectrum


Making AI More Energy Efficient

Whatever your feelings are about AI – It’s all great! It’s going to kill us all! You gotta take the bad with the good! It all depends! I’m not quite sure yet! – most of us recognize that while AI is growing as a force, and will revolutionize entire industries, it consumes an awful lot of energy, leading to concerns about whether it is environmentally sustainable.

Consider these observations, based on the work of the International Energy Agency (IEA):

One of the areas with the fastest-growing demand for energy is the form of machine learning called generative AI, which requires a lot of energy for training and a lot of energy for producing answers to queries. Training a large language model like OpenAI’s GPT-3, for example, uses nearly 1,300 megawatt-hours (MWh) of electricity, the annual consumption of about 130 US homes. According to the IEA, a single Google search takes 0.3 watt-hours of electricity, while a ChatGPT request takes 2.9 watt-hours. (For comparison, an incandescent light bulb draws about 60 watts.) If ChatGPT were integrated into the 9 billion searches done each day, the IEA says, the electricity demand would increase by 10 terawatt-hours a year — the amount consumed by about 1.5 million European Union residents. (Source: Vox)
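That terawatt-hour figure is easy to sanity-check with a bit of back-of-the-envelope arithmetic, sketched here in Python using the per-query estimates quoted above:

    # Rough check of the IEA figures quoted above.
    chatgpt_wh = 2.9           # watt-hours per ChatGPT request (IEA estimate)
    searches_per_day = 9e9     # daily Google searches
    wh_per_year = chatgpt_wh * searches_per_day * 365
    print(f"{wh_per_year / 1e12:.1f} TWh/year")  # -> 9.5 TWh/year

Close enough to the IEA’s round 10 TWh, given that the per-query figures are themselves estimates.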

Given stats like these, and the fact that demand for AI – and the power it consumes along the way – is rapidly growing, it’s no wonder that engineers and researchers are focusing on ways to tamp down, or at least slow, AI’s energy demands.

Writing recently in the EE Times, Simran Khoka notes that “data centers, central to AI computations, currently consume about 1% of global electricity—a figure that could rise to 3% to 8% in the next few decades if present trends persist,” adding that there are other environmental impacts that AI brings with it, such as e-waste and the water usage required for data center cooling. She notes that IBM is one of the leaders in creating analog chips for AI apps. These chips deploy phase-change memory (PCM) technology.

PCM technology alters the material phase between crystalline and amorphous states, enabling high-density storage and swift access times—qualities essential for efficient AI data processing. In IBM’s design, PCM is employed to emulate synaptic weights in artificial neural networks, thus facilitating energy-efficient learning and inference processes.

IBM is not alone. Khoka cites a couple of the little guys: Mythic, which:

…has engineered analog AI processors that amalgamate memory and computation. This integration allows AI tasks to be executed directly within memory, minimizing data movement and enhancing energy efficiency.

She also writes about Rain Neuromorphics, which is developing chips that “process signals continuously and perform neuronal computations, making them ideal for creating scalable and adaptable AI systems that learn and respond in real time.”

Applications well suited to analog chips include edge computing, neuromorphic computing, and AI inference and training.

A principal challenge that switching to analog chips presents is ensuring that they have the same precision and accuracy that digital chips yield. Another hurdle is that, at present, the infrastructure behind AI systems is digital.
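To make the precision concern concrete, here is a minimal numpy sketch (illustrative only; the matrix sizes and the 2% noise level are made-up values) of how errors in analog-stored weights show up in the matrix-vector products at the heart of inference:

    # Sketch: the core trade-off of analog in-memory compute. Weights stored
    # as device conductances carry programming error, so the matrix-vector
    # products used during inference inherit that error.
    import numpy as np

    rng = np.random.default_rng(0)
    W = rng.normal(size=(256, 256))   # ideal (digital) weight matrix
    x = rng.normal(size=256)          # input activations

    noise = rng.normal(scale=0.02, size=W.shape)  # per-device conductance error
    y_digital = W @ x
    y_analog = (W + noise) @ x        # computed "in memory", no weight movement

    rel_err = np.linalg.norm(y_analog - y_digital) / np.linalg.norm(y_digital)
    print(f"relative output error: {rel_err:.1%}")  # compounds across layers

The win is that the weights never move between memory and a processor; the cost is exactly this kind of output error, which digital infrastructure never has to budget for.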

It’s no surprise that MIT is keeping its eye on ways to reduce the energy consumption of voracious AI models. MIT’s Lincoln Laboratory Supercomputing Center (LLSC) is finding that by capping power and slightly increasing task time, the energy consumption of GPUs can be substantially reduced. The trade-off: tasks may take 3 percent longer, but energy consumption is lowered by 12-15 percent. With power-capping constraints in place, the Lincoln Lab supercomputers are also running a lot cooler, decreasing the demand placed on cooling systems – and keeping hardware in service longer. (And something as simple as running jobs at night, when it’s cooler, or in the winter, can greatly reduce cooling needs.)
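For readers who want to try the power-capping idea on their own hardware, here is a minimal sketch using NVIDIA’s NVML Python bindings. The 250 W target is an arbitrary illustrative value, not the limit LLSC uses, and setting it typically requires administrator privileges:

    # Sketch: cap a GPU's power draw via NVIDIA's NVML bindings
    # (pip install nvidia-ml-py). The 250 W target is illustrative only.
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

    # Hardware-allowed range, reported in milliwatts.
    min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
    target_mw = min(max(250_000, min_mw), max_mw)  # clamp 250 W into range

    pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
    print(f"Power cap set to {target_mw / 1000:.0f} W "
          f"(hardware allows {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W)")
    pynvml.nvmlShutdown()

The same cap can be set from the command line with nvidia-smi -pl 250.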

LLSC is also looking at ways to improve how efficiently AI models are trained and used.

When training models, AI developers often focus on improving accuracy, and they build upon previous models as a starting point. To achieve the desired output, they have to figure out what parameters to use, and getting it right can take testing thousands of configurations. This process, called hyperparameter optimization, is one area LLSC researchers have found ripe for cutting down energy waste.

“We’ve developed a model that basically looks at the rate at which a given configuration is learning,” [LLSC senior staff member Vijay] Gadepally says. Given that rate, their model predicts the likely performance. Underperforming models are stopped early. “We can give you a very accurate estimate early on that the best model will be in this top 10 of 100 models running,” he says.

Jettisoning models that are slow learners has resulted in a whopping “80 percent reduction in energy used for model training.”
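The details of LLSC’s predictor aren’t in the article, but the underlying pattern – train many configurations briefly, keep only the ones on pace to win – is easy to sketch. Here is a generic successive-halving-style loop, with a stand-in score function where a real training run would go:

    # Sketch of the general idea: train many hyperparameter configurations
    # briefly, then keep only the ones on pace to win. This is a generic
    # successive-halving-style loop, not LLSC's actual predictive model.
    import random

    def validation_score(config, epochs):
        """Stand-in for a real training run; replace with your model's loop."""
        lr, width = config
        return 1.0 - 0.5 / (epochs * lr * width) + random.gauss(0, 0.01)

    configs = [(lr, width) for lr in (0.01, 0.1, 1.0) for width in (32, 64, 128)]
    budget = 1  # epochs granted in the first round
    while len(configs) > 1:
        scores = {c: validation_score(c, budget) for c in configs}
        configs = sorted(configs, key=scores.get, reverse=True)
        configs = configs[: max(1, len(configs) // 2)]  # stop the laggards early
        budget *= 2  # survivors earn a doubled training budget
    print("winning configuration:", configs[0])

Most of the total compute never gets spent, because half the candidates are dropped at every round; LLSC’s learning-rate predictor simply makes that culling decision earlier and more accurately.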

Whatever your feelings about AI, it’s comforting to know that there are plenty of folks out there trying to ensure that AI’s power consumption will be held in check.



Image source: Emerj Insights

What’s brewing with AI?

Not that I’ve given it all that much thought, but if I’d been asked, I don’t think that I’d have put brewing beer very high on the list of candidates for AI involvement.

Not that the beer industry is any stranger to up-to-date technology; it’s widely used in brewing’s production and logistics processes. But making beer that tastes better? I would have said that this is more art than science. Sure, the major industry players with the more widely known and consumed brands are more focused on the “science” parts of production and logistics – after all, a Corona’s a Corona and a Bud’s a Bud. And the microbrewers (not to mention the homebrewers) would come down more on the “arts” side, using trial and error to come up with the ideal mix.

Of course, even Corona and Budweiser are always introducing new products, and whether you’re one of the big guys or one of the little guys, creating a beer that tastes good isn’t easy. Figuring out whether – to borrow from an ancient Miller ad – a beer tastes great and/or is less filling can involve drafting (and educating) employees and “civilian” beer drinkers to act as taste testers for their products. But, as a recent MIT Technology Review article said, “running such sensory tasting panels is expensive, and perceptions of what tastes good can be highly subjective.”

Enter AI.

Research published in Nature Communications described how AI models are being used to predict not only how consumers will rate a beer, but also how to make a better-tasting beer.

This wasn’t an overnight process. Over a five-year period, researchers analyzed the chemical properties and flavor compounds in 250 commercial beers.

The researchers then combined these detailed analyses with a trained tasting panel’s assessments of the beers—including hop, yeast, and malt flavors—and 180,000 reviews of the same beers taken from the popular online platform RateBeer, sampling scores for the beers’ taste, appearance, aroma, and overall quality.

This large data set, which links chemical data with sensory features, was used to train 10 machine-learning models to accurately predict a beer’s taste, smell, and mouthfeel and how likely a consumer was to rate it highly.
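As a rough illustration of what such a pipeline looks like – a sketch only, since the study’s actual models, features, and data are far richer – here is the shape of the idea in scikit-learn, with synthetic data standing in for the chemical measurements:

    # Sketch: predict consumer ratings from flavor-compound concentrations.
    # A stand-in for the Nature Communications models; the synthetic data
    # and feature count here are illustrative only.
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_beers, n_compounds = 250, 50                 # 250 beers, 50 chemical features
    X = rng.random((n_beers, n_compounds))         # measured compound concentrations
    y = X[:, :5].sum(axis=1) + rng.normal(0, 0.1, n_beers)  # synthetic "rating"

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = GradientBoostingRegressor().fit(X_train, y_train)
    print("held-out R^2:", round(model.score(X_test, y_test), 2))

    # Feature importances point to the compounds driving predicted appreciation,
    # the route by which one might "pinpoint" candidate additives like lactic acid.
    top5 = np.argsort(model.feature_importances_)[::-1][:5]
    print("most influential compounds (column indices):", top5)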

The result? When it came to predicting how the RateBeer reviewers had rated a beer, the AI models actually worked better than trained tasting experts. Further, the models enable the researchers “to pinpoint specific compounds that contribute to consumer appreciation of a beer: people were more likely to rate a beer highly if it contained these specific compounds. For example, the models predicted that adding lactic acid, which is present in tart-tasting sour beers, could improve other kinds of beers by making them taste fresher.”

Admittedly, having lactic acid in a beer doesn’t sound all that appealing. But if the beer tastes fresher, well, just don’t read the fine print on the ingredients list.

One area where they anticipate the AI approach will prove particularly effective is in the development of non-alcoholic beers that taste as good as the real thing. This will be great news for those who want to enjoy a beer without having to consume any alcohol.

There are other instances of AI being used in brewing. Way back in 2016, IntelligentX, a UK AI software startup, came out with four beers based on its Automated Brewing Intelligence algorithm. The release of Amber AI, Black AI, Golden AI, and Pale AI caused a brief flurry of excitement as the first AI-developed beers. Unfortunately, it looks like none of them made much of an impact in the beer market. When I searched for them, I couldn’t find any references beyond 2019.

Maybe the models that the Belgian researchers produced will have more luck creating a successful AI beer.

——————————————————————————————

The full research report from Nature Communications can be found here.

Critical Link Introduces Agilex 5 SoC FPGA Solutions

Syracuse, N.Y. – April 30, 2024 – Critical Link, LLC, a leading US-based manufacturer of FPGA, DSP, and CPU-based System on Modules, is pleased to announce new embedded solutions around the Agilex™ 5 SoC FPGA E-Series from Altera®. Critical Link is developing two product families around the Agilex 5 SoC FPGA E-Series: a single board computer and a system on module (SOM) family.

The MitySBC-A5E single-board computer was developed as part of the Agilex 5 SoC FPGA Early Access Program and will be the first to market. The MitySBC-A5E features a 32mm x 32mm Agilex 5 SoC FPGA E-Series device with 656K LE of FPGA fabric, dual-core Cortex-A55, dual-core Cortex-A76, PCIe 3.0, and 24 transceivers at up to 17 Gbps. The board includes 8GB LPDDR4 for the HPS, 8GB LPDDR4 for the FPGA, 64GB eMMC, microSD, and QSPI NOR for configuration. A rich set of interfaces, including 8 MIPI x4 lanes, 2.5G Ethernet, FMC, USB-C & USB 2, among others, makes this a powerful solution for embedded product development teams working on next-generation industrial performance applications.

The MitySBC-A5E will be available as a development kit as well as a production-suitable single-board computer for customers interested in achieving first-to-market advantages with the high-performance, low-power Agilex 5 SoC FPGA. The product datasheet and other documentation are available today, and customers will benefit from Critical Link’s engineering and application support for the life of their product.

“Critical Link has been partnering with Altera and Intel for more than 10 years, helping customers reach the market fast with next-generation products based on the latest FPGA technology,” says Tom Catalino, Vice President and Founder of Critical Link, LLC. “We are excited to lead the next wave of FPGA-based designs and bring the Agilex 5 SoC FPGA power and performance advantages to our customers.”

Following the introduction of the single board computer, Critical Link is bringing the MitySOM®-A5E family of system-on-modules to market later this year. The MitySOM-A5E family will offer a wide range of FPGA densities, memory configurations, optional transceivers, and temperature ranges all in a compact 51mm x 71mm (2.0” x 2.8”) form factor to fit most applications. These modules are designed for long-term availability and support, meaning customers can confidently design them into long lifespan products in the test & measurement, medical/scientific, defense, and energy/utilities industries. 

Prototypes for the MitySBC-A5E will be available in Q2 2024, with production in early 2025. System on Module prototypes are expected later this year with production to start in 2025.


For more details on the MitySBC-A5E Single Board Computer, visit: https://www.criticallink.com/product/mitysbc-a5e-single-board-computer/.

For more details on the MitySOM-A5E System on Module family, visit: https://www.criticallink.com/product/mitysom-a5e/.


ABOUT THE COMPANY:

Critical Link, LLC (Syracuse, NY www.criticallink.com), founded in 1997, develops system on modules (SOMs) for electronic applications. Our MitySOM® and MityDSP® families incorporate the latest FPGA, DSP, and CPU technologies, and are designed for long product lifespan and performance in the field. We supply OEMs in a wide range of industries including manufacturing, medical, scientific, defense, and energy/utilities. We ship worldwide and are franchised with many of the top electronics distributors. 

Critical Link, LLC, is privately held and is ISO 9001:2015 Registered by SRI Quality System Registrar. Critical Link is a Gold-level member of the Intel® Partner Alliance.


*Altera Agilex 5 FPGA D-Series (A5D031, mid-speed grade) vs. AMD/Xilinx Versal (VM1102, -3HS speed grade) at 90% utilization, 600 MHz, using the vendors’ power estimator tools.

Altera, the Altera logo, and other Altera marks are trademarks of Altera or its subsidiaries.