
News Flash: NAND is transitioning to 3-D

A couple of weeks ago I saw an article by Peter Clarke on EE Times reporting on the coming of 3-D NAND flash memory. He wrote:

An industry-wide transition for the nonvolatile NAND flash memory technology from memory cells in a 2-D array to strings of NAND transistors integrated monolithically in the vertical direction is now anticipated. These 3-D memories are expected to be arranged as a 2-D array of vertical semiconductor channels with many levels of gate-all-around (GAA) structures forming the multiple voltage level memory cell transistors.

His focus was primarily on a presentation given by Keyvan Esfarjani at the IMEC Technology Forum, held in Brussels in late May. Esfarjani is Intel's VP of technology and manufacturing. He's also the co-CEO of IM Flash Technologies (IMFT), a joint venture between Intel and Micron Technology. For IMFT, the path to 3-D NAND runs alongside continued scaling of 2-D NAND.
…Esfarjani, while acknowledging there is a scaling limit for 2-D NAND flash, indicated in one of his slides that 2-D NAND flash can scale to two more nodes at about 15- and 10-nm. The slide showed that the first 3-D NAND generation is likely to be brought up alongside that 15-nm 2-D node. Esfarjani added that 16 layer NAND flash ICs will not be enough to provide an economic benefit. “You need 64 or at least 32 layers,” he said.

While IMFT works on its approach, Toshiba appears to be ahead of the curve. This isn't surprising, given that NAND (and NOR) flash memory were first developed at Toshiba some thirty years ago. There's no date yet for when IMFT's 3-D NAND will go commercial, but Toshiba is anticipated to be in volume production sometime in 2015.

What’s Critical Link’s interest here?

For starters, most of our SoMs include an option for NAND flash memory, and customers who take this option do so primarily to hold the file system that embedded Linux relies on. As embedded Linux permeates further into scientific and medical instrumentation, and even some industrial automation applications, so will the need for NAND. Sure, there are other technologies out there, like SD, MMC, and eMMC (which is now becoming popular). Will eMMC continue to grow its adoption in embedded devices, or will the capacities of 3-D NAND compel embedded designers to go this route?

 

Ethernet’s 40th Anniversary

This is a big year for 40th anniversaries.

We've "celebrated" (or at least acknowledged) the 40th anniversary of the mobile phone, and it's also the 40th anniversary of Ethernet. It's hard to think of any recent developments that have more fundamentally changed the way the world works – economically, socially, and politically – than these two, especially Ethernet, which has been a major factor in the availability of the Internet and corporate intranets, ushering us into the digital age.

Ethernet continues to be the standard for on-premise networking. Even manufacturing has adopted Ethernet (with some modifications), which has been a long time coming; manufacturers now rely on it for the error-free, guaranteed delivery with predictable latency that their applications demand.

While it now seems inevitable that Ethernet would rule, back in the day it had competition from Token Ring and Token Bus technology. But while Ethernet was open and standardized, Token Ring and Token Bus were largely proprietary. (Hmmm. Where have we seen this one before? Maybe VHS vs. Betamax? Although in that case Betamax was clearly the superior technology, while in the networking world the edge was held by Ethernet.)

EE Times, by the way, got there just in time to follow the development of Ethernet (and mobile phones). It observed its own 40th anniversary late in 2012.

On the less techie front, 2013 is also the 40th anniversary of Deep Purple's most famous song, Smoke on the Water.

If you clicked on the link – and you should – you'll see I'm not putting in a plug here for AT&T Mobile Share, or for Deep Purple, even though the song features one of rock's classic guitar riffs. I've got it in here on behalf of all the middle-aged dads out there. (I'm pretty sure most of you can identify with this ad.)

If the hyperlink above doesn't work, this URL will take you there: http://www.youtube.com/watch?v=KP-IdrVvYOg

What The Nest Learning Thermostat and Critical Link Have in Common

As an engineer, I’m always interested in tear-downs – dismantling a machine, a gadget, a device to see what’s in its innards. Most engineers, I suspect, start doing this when they’re kids. We want to know how things work – the clock, the motor, the radio – which is how we end up becoming engineers. Since so many items now are electronics-based, rather than mechanical, things look different on the inside than they did in the good old days.  But it’s still plenty interesting.

One of the tear-downs that I particularly enjoyed actually happened over a year ago, and that was of a Nest thermostat. (Here’s the article by Sam Sheffer, which was on The Verge in late December 2011.)

For those who aren't familiar with the Nest, it's a "Learning Thermostat" that you don't need to program to schedule when your HVAC clicks on and off. It learns your habits – when you get home from work and turn up the heat or turn on the AC, what temperatures you set and when – and programs itself. It's also got WiFi, so it can be controlled from your iPhone. All this is good for the environment and the pocketbook, eliminating the waste of having the heat running when no one's home and saving on your heating and cooling bills.
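Nest doesn't publish how its learning actually works, but the general idea – observe the dial, then program the recurring adjustments – is easy to sketch. Here's a toy version in Python; the class, the median rule, and the three-sample threshold are my own illustrative choices, not anything from Nest:

```python
from collections import defaultdict
from statistics import median

class LearningThermostat:
    """Toy sketch of schedule learning. Nothing here is Nest's actual
    algorithm; it just illustrates the general idea."""

    def __init__(self):
        # Manual setpoint changes the user has made, keyed by (weekday, hour).
        self.observations = defaultdict(list)

    def record_adjustment(self, weekday, hour, setpoint_f):
        """Call this whenever the user turns the dial."""
        self.observations[(weekday, hour)].append(setpoint_f)

    def learned_schedule(self, min_samples=3):
        """Once a time slot has enough samples, program a setpoint for it."""
        return {slot: median(temps)
                for slot, temps in self.observations.items()
                if len(temps) >= min_samples}

# Three Mondays in a row, the owner sets 70 F when they get home at 6 pm...
stat = LearningThermostat()
for _ in range(3):
    stat.record_adjustment(weekday=0, hour=18, setpoint_f=70)
print(stat.learned_schedule())   # {(0, 18): 70} -- 6 pm Mondays now set themselves
```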

Anyway, what they found inside the Nest was a Texas Instruments ARM Cortex-A8-based processor, the chip that makes the Nest so smart.

I found this pretty interesting, because our MitySOM-335x is also based on an ARM Cortex-A8. It's not the same one – they use the AM3703; ours is from the AM335x family. Our choice was motivated by its fit with our target market, which includes everything from laboratory and medical equipment to test and measurement devices. These are applications that sometimes require industrial protocols like CAN and PROFIBUS, but more importantly always require Ethernet. The AM335x features dual gigabit Ethernet interfaces, whereas the AM3703 does not support wired Ethernet.

The AM3703 may be processing overkill for the Nest smart thermostat. All things considered, I suspect that Nest went with the AM3703 because the AM335x may not have been available at the time they were doing their development. The AM335x is available at lower speed grades than the AM3703, so Nest could have gotten a good processor at a lower price point – a processor still fully capable of doing what the Nest needs done. At least for now. Nest may have other capabilities in store for it. I certainly hope so – I just put one in my home, and I love new features!

Anyway, I really do like to read about tear-downs (and even to do my own every once in a while).

Altera-Based Mity in the Works

Not sure if you saw our recent announcement that Critical Link is developing our first Altera-based module. Our new SoM, the MitySOM-5CSX, features the Altera Cyclone V SX-U672, which combines FPGA logic and a dual-core ARM Cortex-A9 processor subsystem. We've added NOR flash and DDR3 RAM memory subsystems. This is a great addition to the Mity family, and is part of our continued commitment to giving our customers choice.

We see this new board as a good choice for high-throughput applications like machine vision, test and measurement, scientific imaging, medical imaging and instrumentation, military/aerospace, and a variety of motor control applications.

Some of the features that support these high-throughput applications:

  • High-Bandwidth System Interfaces
    – Six 3.125-Gbps transceivers
    – PCIe hard core
    – Up to 145 I/O, many supporting 875-MHz SerDes
    – Two Gigabit Ethernet interfaces
  • High-Bandwidth On-Chip Interfaces
    – 102-Gbps HPS-to-FPGA interface
    – 102-Gbps FPGA-to-SDRAM interface

Critical Link has also become a member of Altera's Design Services Network, and we plan on doing more Altera-based SoMs.

Anyway, the MitySOM-5CSX will be available through Arrow Electronics beginning in Q3 2013. Let me know if you want to learn anything more about it.

Get Smart

With the possible exception of humankind, everything these days seems to be getting smarter.

A while back, I did a post about “smart” basketballs, which have embedded sensors that capture all sorts of data that helps coaches diagnose and correct player problems. Then there are cars that parallel park themselves. Watches that let you know you’ve got a message – and maybe even let you respond to it. There’s even a smart trash bin that keeps an eye on what you’re throwing out to see whether you’re sending something to landfill that could actually be recycled.

The other day, I was reading about smart forks, of all things.

My initial reaction was, frankly, smart forks? Sounds like a dumb idea to me.

But then I read a bit about them, and thought a bit about them, and came to the conclusion that a device that helps people slow their eating down so they can lose some weight is not a half-bad idea at all.

The fork can be used to passively track eating habits and automatically sync that information, including duration of meals and frequency of forkfuls, with a smartphone…The device can also be set up for behavior modification, vibrating any time the diner is eating too quickly as a gentle reminder to slow down. (Source: CNN)

Okay, maybe that vibrating feature is a bit too much, and maybe we should all be able to figure out whether we’re eating too much, too fast, on our own, but it’s not the dumbest smart idea out there, that’s for sure.
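For the engineers in the audience, the "too fast" check is simple enough to sketch. Here's a toy version in Python; the ten-second gap and the class itself are my own illustrative choices, not anything from the actual product:

```python
from dataclasses import dataclass, field

@dataclass
class SmartFork:
    """Toy model of the vibrate-when-eating-too-fast behavior."""
    min_gap_s: float = 10.0                    # desired seconds between forkfuls
    forkful_times: list = field(default_factory=list)

    def register_forkful(self, t_s):
        """Record a forkful at time t_s (seconds); return True if the fork
        should vibrate because this bite came too soon after the last one."""
        too_fast = bool(self.forkful_times) and \
                   (t_s - self.forkful_times[-1]) < self.min_gap_s
        self.forkful_times.append(t_s)
        return too_fast

fork = SmartFork()
for t in [0, 4, 16, 18]:                       # simulated forkful timestamps
    if fork.register_forkful(t):
        print(f"buzz at t={t}s: slow down!")   # fires at t=4 and t=18
```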
Smart technology, of course, is not all about gadgets. There are obviously a lot of tremendously important uses coming our way from the development of ever more powerful – i.e., smarter – and ever less expensive sensors.

Think of the sensors used in prosthetic devices that make it easier for those who’ve lost their limbs to get back to normal living more quickly. Think about the sensors used in medical devices that let patients go home, while still having their doctor keep an eye on them. Think about sensors used to regulate utility use, making it more environmentally friendly.

Here at Critical Link, our focus is on the complex end of the application spectrum, so you're not apt to find one of our Mity SoM family used in something like a smart fork. But we're doing our part to make the world a smarter place. (If only we could do something about humankind…)

Dream Job

As my friends (and wife) are all very well aware, I'm a car guy. I love BMWs. I follow NASCAR. And Formula 1.

On a recent business trip to Germany, I got to tour the BMW factory in Munich.

I’m also an engineering guy.

It’s what I do. It’s what I love to do.

So it’s not surprising that a recent article over on IEEE Spectrum caught my eye.

Rick Mahurin is a Purdue engineering grad who's a data acquisition engineer for Walker Racing, where he gets to do things like analyze what's going on with cars like the $650K Porsche 911 GT3 RSR that was Team Falken Tire/Walker Racing's entrant last year in the American Le Mans Series.

These days, pretty much all cars are high tech, but race cars are high tech on steroids, chock-full of sensors that measure all sorts of things: tire pressure, temperature, engine speed. Rick Mahurin's car has over 300 of them.

As cars fly by him on the track, he sits in the pit and concentrates on three computer screens, which provide continuously updated plots of data beaming from the Porsche. If he notices any measurements creeping outside their normal ranges, he radios an alert to the crew chief, who is also in the pit but unable to hear anything above the din beyond what is sent to his headphones. (Source: IEEE Spectrum)
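That "creeping outside their normal ranges" check is the heart of any monitoring setup like this. Here's a minimal sketch in Python; the channel names and limits are made up for illustration, not taken from Walker Racing's actual system:

```python
# Made-up channel names and limits, standing in for a real car's 300+ sensors.
NORMAL_RANGES = {
    "oil_temp_c":  (80.0, 130.0),
    "tire_psi_fl": (26.0, 34.0),
    "engine_rpm":  (0.0, 9500.0),
}

def check_sample(sample):
    """Return an alert string for every channel outside its normal range."""
    alerts = []
    for channel, value in sample.items():
        lo, hi = NORMAL_RANGES.get(channel, (float("-inf"), float("inf")))
        if not lo <= value <= hi:
            alerts.append(f"{channel}={value} outside [{lo}, {hi}]")
    return alerts

# One incoming telemetry frame from the car:
for alert in check_sample({"oil_temp_c": 137.2, "tire_psi_fl": 29.0}):
    print("radio the crew chief:", alert)
```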

Like most engineers, Rick is also a tinkerer, and between races he's fine-tuning the car's systems. In the wake of a crash,

 … he spent a week scrutinizing the sensors on the car to be sure they weren’t damaged. He installed some new electronics for collecting and processing the raw sensor readings as well as junction boxes designed to simplify the wiring of the car’s onboard video system.

His tinkering paid off, and at the next week’s race, the Porsche came in first.

 Silicon and software, as much as rubber and steel, made that win possible, Mahurin notes proudly. “When the systems that you have in place enable you to do your job, and it all comes out the way it’s supposed to, it’s a joy.”

Now I’m not saying I want Rick Mahurin’s job. I like my own brand of combining silicon and software just fine.

But I’ve got to say, for a car-loving engineer, his sure sounds like a dream job.

 

Critical Link at Design West

Last week, I spent three days in San Jose at Design West (the renamed Embedded Systems Conference).


While the show seemed much smaller than it has been in the past, there were still plenty of interesting things to see, including a training session on how to “program your very own wireless mesh networked propeller beanie hat”.  I took a pass on that one – come on, what self-respecting engineer actually needs another propeller beanie cap?

But there were still plenty of things that caught my interest.

Our neighbors at the Atmospheric Science Research Center at SUNY-Albany’s Whiteface Mountain Observatory had an interesting session on building a real-time atmospheric monitor/data acquisition device on a Raspberry Pi. Cool! (Or warm, depending on how you look at things.)

It was an intense couple of days.  We had a kiosk – with a great location – in the TI booth, and most of the interest in DSP was directed our way. Just off the top of my head, however, I’d say that interest in pure DSP was not as high as interest in our MityDSP-L138F, which has both ARM and DSP processors (as well as an optional FPGA), but maybe that’s because that’s what we were focusing on in the booth.

TI had a training area set up in their booth, and we had two opportunities to give a presentation on using the MityDSP-L138F for stereo vision, both of which were well attended.

Special thanks to the folks at TI: to Kanika Carver and Melissa Hancock for inviting us, and to  Maly Chhorn and Sherry Howell for their support during the show.

CL pod in TI Booth

One More Reason I Wish I’d Taken Touch Typing

While over the years my keyboard skills have gone way beyond simple hunt and peck, there have been a few occasions when I wish I'd listened to Mrs. Crabtree in ninth grade when she advised everyone to take touch typing. One of those occasions was when I saw a piece by Andrew Liszewski on "invisible keyboards" over on Gizmodo a while back.

Fujitsu has a prototype that "uses the tablet's camera to track your finger movements on a desk, as if you were typing away on an invisible keyboard." This removes the need to surrender so much screen area to the keyboard and/or to carry a keyboard around with you, which more or less defeats the purpose of a tablet.

Since everyone has differently shaped, sized, and colored hands, the software has to spend a bit of time learning how a user types before there's any kind of useful accuracy. But once school's out, it does appear to be a plausible alternative to a physical keyboard.

I think that this would take some getting used to, especially if, like me, you do occasionally have to look down to see where the @ sign is – but I’ve got to say that this is one of the niftiest applications of machine vision that I’ve seen – even if I never get to use it.  This application appears to be using a single camera, but I think it might perform better with a stereo approach. This would allow for more complex gestures. Maybe you could even have a keyboard in the air!
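If you're wondering what that learning phase might look like, here's a toy sketch in Python: calibrate where each key sits on the desk from a few observed fingertip positions, then classify new touches by nearest key. Fujitsu's actual system is surely far more sophisticated; everything below is my own illustration:

```python
import math

class InvisibleKeyboard:
    """Toy per-user calibration: learn where each key sits on the desk,
    then classify later fingertip touches by nearest calibrated key."""

    def __init__(self):
        self.key_centers = {}          # key -> (x, y) position on the desk

    def calibrate(self, key, touches):
        """Average the fingertip positions seen while the user types `key`."""
        xs, ys = zip(*touches)
        self.key_centers[key] = (sum(xs) / len(xs), sum(ys) / len(ys))

    def classify(self, touch):
        """Map a new fingertip touch to the nearest calibrated key."""
        return min(self.key_centers,
                   key=lambda k: math.dist(self.key_centers[k], touch))

kb = InvisibleKeyboard()
kb.calibrate("a", [(10, 52), (11, 50)])
kb.calibrate("s", [(24, 51), (25, 49)])
print(kb.classify((12, 51)))           # -> 'a'
```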

As for touch typing, who knew that Mrs. Crabtree was right all along?

Meanwhile, not to be outdone – okay, okay, I admit: with the invisible keyboard, we are outdone – Critical Link does some playing around on its own with both machine and stereo vision. Here’s a demo that we did of our stereo vision technology at Photonics West this past February:

Critical Link’s stereo vision demo at Photonics West

ASICs? Souped up FPGAs? You decide.

Rick Merritt had an interesting article over on EE Times a few weeks ago, and it's a good springboard for any debate on whether FPGAs with communications cores are replacing ASICs in the comms world.

Rick’s article came on the heels of an early March announcement from Xilinx in which they talked about new solutions that leverage their programmable FPGAs and ARM-based SoCs and are more flexible and less costly to develop than ASICs. Xilinx sees the action moving away from ASICs and towards FPGAs.

Altera also has lots of comms cores in its portfolio, and, not surprisingly, they’re equally adamant about seeing a shift from ASICs to FPGAs.

There’s some data that Rick cited in his piece that seems to support the Altera-Xilinx position:

International Data Corp reported total ASIC starts declined to 2,313 in 2011, down six percent from 2002 levels. In wired comms, the decline was nearly twice as steep to 442 ASICs in 2011, down 11 percent from 2002.

Weighing in on behalf of ASICs, Rick noted that Cisco remains committed to developing them, and that HiSilicon (Huawei's silicon design division) is going strong.

Critical Link's approach is, of course, more along the lines of what Altera and Xilinx do. We continue to see strong demand from application developers who don't want to paint themselves into a corner with an expensive and extended ASIC development cycle, and who produce applications in volumes too small to justify an ASIC build (even if they were willing and able to forgo the flexibility and customizability that an ASIC can't offer).

We’ll play Switzerland here and stay neutral: there’s room for both alternatives.

But I will say that the debate’s an interesting one!

Testing. One, Two, Three Testing

There was an interesting article by Anil Khitolia on EDN a few weeks ago on LTE stress testing. He started out with a fast fact that I found pretty shocking – a claim that 15-20% of smartphones are returned. Anil attributes this to a lack of sufficiently rigorous testing:

Current wireless industry standards require device certification or conformance testing prior to launch. This type of testing requires the device to pass a minimum performance standard that focuses on device functionality, but it does not test the device the way the consumer will ultimately use it. Certification testing guarantees a device meets industry standards, but it tests only one operation at a time and was never intended to establish how well that device works when left running for many hours.

He goes on to talk about the need for stress testing, especially as “LTE has added another dimension and sense of urgency to the equation.” (He also gives a nice example of the difference between a simple conformance checklist test and a stress test that you might find worthwhile. His article is definitely worth a read!)

My initial reaction, which I posted in a comment, was:

Stress testing is one of the important steps in fielding a successful product. With the rapid pace of the constant two-year upgrade cycle that cell phone customers are addicted to, it seems that providers have perhaps been rushing – or skipping altogether – this important step toward reliability and, ultimately, customer satisfaction.

For Critical Link, stress testing has always been, well, a critical link in our board development process. Unlike the average smartphone, our system on modules (SoMs) are expected to have a very long life. They're often embedded in complex (and expensive) equipment with a life expectancy of ten, or even twenty, years. These sorts of applications are pretty much the exact opposite of consumer products, which quickly become throwaways. Scientific instrumentation, industrial automation, and similar applications don't get replaced just because something new and flashy appears on the scene.

For our SoMs, we do many types of design verification testing at the board level, on the prototypes, to make sure that the design is solid and robust. Generally, we run them across the full temperature range in a temperature chamber, cycling the temperature up and down to make sure the board supports the range it's rated for and can function properly through many thermal cycles. We also perform shock and vibration testing, which I wrote about here a couple of weeks back.
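For a flavor of what a cycling profile looks like, here's a bare-bones sketch in Python. The chamber object and its calls are hypothetical stand-ins for a vendor-specific driver, and the -40 to +85 C range and dwell time are example numbers, not the spec from any particular project:

```python
import time

def thermal_cycle(chamber, t_min_c=-40, t_max_c=85, cycles=20, dwell_min=30):
    """Step a temperature chamber between the board's rated extremes,
    dwelling at each end. `chamber` is a hypothetical driver object; real
    chambers are commanded over GPIB/RS-232/Ethernet with vendor-specific
    protocols."""
    for n in range(cycles):
        for setpoint in (t_min_c, t_max_c):
            chamber.set_temperature(setpoint)   # hypothetical driver call
            chamber.wait_until_stable()         # hypothetical: soak to setpoint
            time.sleep(dwell_min * 60)          # dwell; run functional tests here
            print(f"cycle {n + 1}: dwell at {setpoint} C complete")
```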

If we're involved in a project at the application stage, where we're working with a customer to finalize the entire hardware and software package, we also perform stress testing. We accelerate the life cycle of the product, running it for many days, if not weeks, under highly stressful situations and conditions. If the product (and its associated application software), for example, is expected to receive ten messages per second over Ethernet, we may send 100 or even 1,000 messages per second to force any issues with memory leaks and other potential longevity problems to the surface quickly. Once the product passes a full set of functional tests to ensure all the intended requirements are met, and the stress testing is complete, the customer can move to production – but only once we've ensured that the design works under more stressful conditions than it's ever likely to experience in actual use.
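To make that concrete, here's a bare-bones sketch of that kind of message flood in Python. The address, port, payload, and rate are illustrative only, not from a real customer project:

```python
import socket
import time

TARGET = ("192.168.1.50", 5000)     # device under test (example address/port)
RATE_HZ = 1000                      # 100x an expected 10 messages/second
PAYLOAD = b"STRESS-TEST-MESSAGE\n"

def flood(duration_s=60):
    """Blast UDP messages at the target far faster than its rated load."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sent = 0
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        sock.sendto(PAYLOAD, TARGET)
        sent += 1
        time.sleep(1.0 / RATE_HZ)   # crude pacing; good enough for a sketch
    print(f"sent {sent} messages; now check the DUT's memory use and logs")

if __name__ == "__main__":
    flood()
```

While the script hammers the device, we watch the target side for dropped messages, error counters, and slowly growing memory use – the symptoms that surface in hours under flood conditions but might take months to appear at the rated load.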