A look back at 2014

As 2014 winds down, we thought it would be fun to look back at the year and reflect – okay, brag a bit – about our accomplishments.

In January, we announced a new Analog I/O Expansion Kit for the MityDSP L138F, adding support for both low- and high-speed DAC and ADC data converters to our standard I/O Dev Kit for that System on Module.

And speaking of SOMs, we added to the Mity brand by introducing the MitySOM. Our original SOMs were all DSP-based, but that’s changed over the years. We wanted to reflect that change, while also leveraging the brand equity that Critical Link has built with the Mity name.

In other product news, we expanded our MityCAM family, adding new low-light options with the introduction of the MityCAM-B1910F. This is based on Fairchild Imaging’s CIS 1910F sCMOS sensor, and complements the original MityCAM-B2521F. We also brought out the MityCAM-C8000, which offers 8-megapixel resolution, windowing capabilities and onboard processing. Its sensor was developed by CMOSIS, which specializes in advanced image sensors.

The year also saw Critical Link achieve ISO 9001:2008 quality system registration, which demonstrates that our customer support, engineering design, and production processes meet the rigorous ISO standards.

In 2014, we also hit the 1,000 mark for registered support users. To keep up with this growth, we expanded our production support staff here in Syracuse. We also expanded our partnerships with our distributors and sales channels, leading to a very healthy growth year for Critical Link.

All in all, it was a very good year.

And so, with many thanks to Critical Link’s customers, partners and staff, we wish everyone a joyous holiday season, and a happy, healthy and Mity new year.

————————————————————————————————————————————–

Critical Link’s blog is taking New Year’s Eve off, but we’ll be back on January 7, 2015.

That is one massive HD display

(Source: demotix.com)

I was in New York City over Thanksgiving and while I was there, I couldn’t help but notice the massive new LED screen in Times Square.

Did I say massive?

An article I saw on it in Reuters reported that it’s the “world’s largest high-definition video display…with nearly 24 million LED pixels.” (There are larger display screens at various locations around the world, including one in Las Vegas, but this is the biggest HD one.) It’s “longer than a football field and eight stories high”, and was turned on for the first time the day before Thanksgiving. So I was pretty much there for its introduction.

When I saw it, Google was doing the advertising, but, not surprisingly, I’m a bit more interested in the technology than what’s being advertised.

The technology is a Diamond Vision display from Mitsubishi Electric Power Products, and:

 …exceeds 4k ultra-high-definition pixel density. The display is a massive 25,610-square-feet, and with a pixel density of 2,368 x 10,048, it will be the highest resolution LED video display in the world of this size. The installation will employ a Mitsubishi Electric Diamond Vision AVL-ODT10 large-scale display which provides true 10mm pixel pitch spacing and uses a 3-in-1 surface mount LED (SMD) featuring Mitsubishi Electric’s revolutionary Real Black™ LED technology. (Source: Mitsubishi press release on BusinessWire.)
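The quoted numbers hang together, and they’re easy to sanity-check. Here’s a quick back-of-the-envelope in Python, using only the figures from the press release and the Reuters piece above:

```python
# Sanity-checking the Times Square display figures from the press release.
width_px, height_px = 10_048, 2_368  # the quoted "pixel density" is really the resolution
pitch_m = 0.010                      # 10 mm pixel pitch

total_pixels = width_px * height_px
print(f"Total pixels: {total_pixels:,}")   # 23,793,664 -- "nearly 24 million LED pixels"

# Physical dimensions implied by a 10 mm pitch:
width_m = width_px * pitch_m               # ~100.5 m, longer than a football field
height_m = height_px * pitch_m             # ~23.7 m, roughly eight stories
area_sqft = width_m * height_m * 10.7639   # square meters -> square feet
print(f"{width_m:.1f} m x {height_m:.1f} m, ~{area_sqft:,.0f} sq ft")  # ~25,611 sq ft
```

The resolution and the 10 mm pitch reproduce both the “nearly 24 million” pixel count and the 25,610-square-foot footprint almost exactly, so the press release arithmetic checks out.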

Pretty impressive, I’ve got to admit.

But, personally, not quite as impressive as what brought me to NYC over Thanksgiving, which was watching my younger daughter and her bandmates in the Baldwinsville High School Marching Band perform in the Macy’s Thanksgiving Day Parade. Here’s the link to the YouTube video of the band doing a Daft Punk medley, on nationwide TV. Couldn’t be prouder!

Heads-Up Technology

Between last week’s post and this one, it’s looking like December is starting out as Automotive Month here at Critical Link.

This is really not surprising, as so many cool embedded and vision systems applications are happening in the transportation world. One of the more interesting is Heads-Up Display (HUD) technology. A HUD presents information on a transparent surface in front of someone, so there’s no need to look down. Originally developed for military pilots, it’s now making its way into cars. With HUD, drivers can see dashboard information on their windshield (and see through it), rather than taking their eyes off the road to look down for it.

The video linked here is from our friends and partners at TI, who have developed some very interesting HUD technology. In their words:

DLP technology enables augmented reality head-up displays that offer quality, high brightness, wide field of view and the flexibility to increase the virtual image distance, all in the driver’s natural line of sight.

A few years back, I test drove a car (for car mavens, it was a BMW M5, and I did my driving at the BMW Performance Center – a really cool experience, by the way). I couldn’t see the HUD at all – until the instructor asked if my sunglasses were polarized. Bingo! The polarization, unfortunately, wiped out the image.
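My guess at the physics here is just that – a guess, not anything BMW or TI confirmed: many HUD designs emit polarized light, and polarizing sunglasses transmit light according to Malus’s law, so a crossed polarization axis wipes the image out almost entirely. A minimal sketch:

```python
import math

# Malus's law: intensity through a polarizer at angle theta relative to
# the light's polarization axis is I = I0 * cos^2(theta).
def transmitted_percent(theta_deg: float, i0: float = 100.0) -> float:
    return i0 * math.cos(math.radians(theta_deg)) ** 2

for theta in (0, 45, 90):
    print(f"{theta:>2} deg: {transmitted_percent(theta):5.1f}% of the HUD light gets through")
# At 90 degrees (crossed axes) essentially nothing is transmitted --
# which would make a polarized HUD image vanish behind polarized lenses.
```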

Based on the video, there’s an entirely new level to the amount of detail that can now be displayed with DLP vs. what I saw a few years back (once I took my sunglasses off). And the display now appears to be way out in front of you, past the front bumper, so you can really keep your eyes up and on the road.

Wonder if they’ve solved the polarization problem?

Anyway, I’m looking forward to having a car with HUD. Maybe next time…

 

————————————————————————————————————————————

For more from TI, here’s the link to their DLP Technology page.

Finding Out the Hard Way

Because Critical Link does so much with vision systems, I’m always interested in camera applications.

On a recent European trip, I noticed that they have a lot of traffic cameras on their highways.

This is something that is coming into use in the States – often to identify those who run red lights – but which is more prevalent overseas. Anyway, while driving from Germany to Zurich Airport, which is just over the German border, I was saying to myself that it was pretty cool to see all those traffic cameras pointed at me – so cool that I tested one out.

As anyone who’s driven in Germany knows, it’s pretty easy to gun it a bit on the Autobahn, which I apparently managed to do. (All made even easier by it being 4 a.m.)

Anyway, we were in a tunnel and saw a red flash go off.

Uh-oh.

My wife, who was traveling with me, asked what it was, and I had to tell her that it meant that I’d just gotten a speeding ticket.

I haven’t heard back yet from the rental car company, but I’m pretty sure that they’ll be piling on a hefty fee to whatever the speeding fine is.

Fortunately, I was speeding in Germany rather than in Switzerland, where the speeding fines can be an order of magnitude higher. If you’re practicing extreme speeding, the Swiss will even confiscate your car. I was not, of course, extreme speeding, so the fine will likely be somewhere in the 15 to 25 Euro range.

For all my rushing to get to the airport on time – one good thing about a no-officer-involved ticket is that you’re not pulled over on the side of the highway while they run your license – our plane ended up having a mechanical issue (a faulty bleed valve), so we got a little extra taste of Zurich. No driving involved!

 

Cape Vincent Lighthouse

The other day, I happened across an article in Vision Systems Design on how CMOS imager makers are pushing to increase the dynamic range their sensors support so that they can capture images under both day and night lighting conditions. The article highlights some of the techniques sensor manufacturers are using to increase dynamic range, and references both Fairchild and CMOSIS sensors (among others). Those mentions caught my interest because those sensors have been integrated into the MityCAM platform.
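For those who like numbers: an image sensor’s dynamic range is typically quoted in decibels as the ratio of full-well capacity to read noise. A minimal sketch of the calculation – the electron counts below are illustrative values I’ve made up, not specs for the Fairchild or CMOSIS parts:

```python
import math

def dynamic_range_db(full_well_e: float, read_noise_e: float) -> float:
    """Dynamic range in dB: 20 * log10(full-well electrons / read-noise electrons)."""
    return 20 * math.log10(full_well_e / read_noise_e)

# Hypothetical illustration values, not vendor specs:
print(f"{dynamic_range_db(30_000, 30.0):.1f} dB")  # 60.0 dB -- a modest CMOS baseline
print(f"{dynamic_range_db(30_000, 1.5):.1f} dB")   # 86.0 dB -- what low read noise buys you
```

Lowering read noise (or raising full-well capacity) is what lets a single exposure hold detail in both the bright and dark parts of a scene.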

And although I can’t always make a connection between Critical Link products and my personal life, while reading about dynamic range I could definitely make one.

What came to mind was a picture I took on one of the first vacations my husband and I ever took as a couple.

It was one of those spur-of-the-moment trips you can only take pre-kids, and, because it was so unplanned, we couldn’t get a place in a state park. We ended up at a pretty terrible campsite – definitely not the kind of place we wanted to hang around all day. So we took a side trip to the town of Cape Vincent, on the St. Lawrence River at Lake Ontario, which – unlike our campsite – is just lovely. Cape Vincent also has a lighthouse, and the sun was falling just behind it, so I decided to take a picture. Even though I didn’t have much of a camera – and it was a film camera, so I couldn’t keep working the shot until I got it right – I managed to take a wonderful picture. Over the years, my lucky shot became one of my favorites.

Unfortunately, over those years we accumulated the usual avalanche of post-kid stuff, and while I thought I knew where the picture I was thinking of was, it wasn’t there.

What you’re missing out on is just a lovely shot: the sun was falling just behind the lighthouse, in such a way that you could just see a piece of it where the roof meets the tower. It was very bright right there, but it didn’t completely darken the rest of the lighthouse, and you could still see most of the features.

Anyway, knowing what I know now about dynamic range, I can’t believe I was able to take such a beautiful picture with whatever junky camera I had at the time, which certainly did not have any dynamic range features. It was purely dumb luck!

But, alas, the photo I was looking for never turned up; I did find a later Cape Vincent lighthouse shot, though. It’s almost, but not quite, what I was looking for, and I’ve included it anyway.

Meanwhile, tomorrow is Thanksgiving, and those of us at Critical Link have much to be thankful for, including the wonderful customers who bring us such interesting work to do.

From the Critical Link family to yours, Happy Thanksgiving.

Engineering Support Hits – and Passes – a New Milestone

More and more often we are hearing from customers that the responsiveness and support provided by our engineering team is head and shoulders above most others in the industry. To be honest, we never really understood what the big deal was. There are a lot of good engineering teams out there who take pride in their work and customer relationships. And while I’ve always counted us among them, lately I’m starting to think there may be more to it.

Part of what’s got my attention is that our engineering support site recently passed the 1,000 registered users mark. This wasn’t a milestone we ever anticipated reaching when we first launched the site a few years ago. And the trend isn’t slowing down: in the time it took me to get this post together, the 1,000-user mark had already moved into our rearview mirror.

Now, 1000 users may not seem like a big number to some companies, but most of what is available on our site does not require registration, so it’s a significantly smaller subset of folks who sign up. Without a username and password, engineers can review forum posts for previously asked questions and access product documentation on wikis, including support for build flow, software, FPGA/VHDL, and hardware design. Sample projects for our MitySOM and MityDSP products are also made available without registration.

So why would anyone need to register at all? Two main reasons: (1) to interact with our engineers directly by posting on the forums, and (2) to access additional product documentation, like baseboard schematics and SOM pin maps.

Another important thing we’ve learned? There are quite a few companies who attach strings to their engineering support. In some cases people are only allowed access once they have made a purchase, and then their support is limited to 1 year post-purchase. We believe it’s important for companies to get a sense of who they’re working with before they buy, and luckily for us, looking at recent trends, it seems like our customers agree!

The Internet of Animals

I’m sure you’ve heard of Fitbit, the activity tracker that monitors and reports on how many steps you take in a day.

Well, now there’s something similar for dogs. Fitbite might have been a better name, but that probably would have caused some trademark infringement problems – not to mention that some folks might not like the association with “dog bites man” and all that. In any case, the product is called the Whistle Activity Monitor:

… a health tracker for your dog. It attaches to any collar and measures your dog’s activities, giving you a new perspective on day-to-day behavior and long-term trends. (Source: Whistle)

It doesn’t just measure that activity – or lack thereof: let’s face it, most of these guys spend a lot of time lolling around resting – it also sends the info to your iPhone. As with Fitbit, you can set a goal that you want your pup to meet, and take remedial action if he doesn’t seem to be getting there.

Whistle also offers a number of recipes/applets: you can post your dog’s info to a spreadsheet (which seems a bit obsessive), or have a walk put on your calendar if your dog isn’t reaching the exercise goal for the day – as sketched below.
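That calendar applet presumably boils down to a simple threshold rule. Here’s a hypothetical sketch – Whistle hasn’t published the applet internals, and the function names and goal number are mine:

```python
from datetime import date

# Hypothetical sketch of a "schedule a walk if the goal isn't met" rule.
# Illustrative only: not Whistle's API or actual applet logic.
DAILY_GOAL_MINUTES = 60

def end_of_day_check(active_minutes_today: int) -> None:
    if active_minutes_today < DAILY_GOAL_MINUTES:
        shortfall = DAILY_GOAL_MINUTES - active_minutes_today
        print(f"{date.today()}: {shortfall} min short of goal -- adding a walk to the calendar")
    else:
        print(f"{date.today()}: goal met -- good dog")

end_of_day_check(active_minutes_today=35)
```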

Me, I always like to make sure that our dog gets plenty of good exercise, so I won’t be getting one of these just yet. In any case, I do find a GPS tracking system far more useful for our little escape artist!

All part of the Internet of Things that just keeps growing and growing, with every type of application imaginable.

I’ll have to ask my daughter, who’s in veterinary school, what she thinks of Whistle.

Meanwhile, I want to end with their motto:  Improving the lives of pets, as they do ours.

Certainly true at our house!

Schrödinger’s cat (sort of)

This week, Critical Link is in Stuttgart, Germany, for Vision, the International Machine Vision Trade Fair.

So while I’m always thinking about things related to vision, this week I’m really thinking about things related to vision. While catching up on my reading, I came across an interesting article from last August in OPLI, Quantum Physics Enabled Revolutionary Imaging Method.

The article talks about an experimental quantum physics method of imaging objects in extremely low/no light situations, and in this case a situation where the sensor never actually “sees” the object at all, but is still able to image it.

“In general, to obtain an image of an object one has to illuminate it with a light beam and use a camera to sense the light that is either scattered or transmitted through that object. The type of light used to shine onto the object depends on the properties that one would like to image. Unfortunately, in many practical situations the ideal type of light for the illumination of the object is one for which cameras do not exist.

“The experiment published in Nature this week for the first time breaks this seemingly self-evident limitation. The object (e.g. the contour of a cat) is illuminated with light that remains undetected. Moreover, the light that forms an image of the cat on the camera never interacts with it.” (Source: OPLI article linked above.)

The suspicion is that they chose a cat (rather than, say, a dog) in a nod to the Schrödinger’s cat paradox, which holds that, absent any outside information one way or the other, a cat in a closed box could be both dead and alive at the same time.

Similarly, the new imaging technique relies on a lack of information regarding where the photons are created and which path they take.

We don’t quite get involved in “no information” scenarios like this experiment, but two of our newest MityCAMs (the MityCAM-B2521F and MityCAM-B1910F) incorporate sCMOS sensors from Fairchild Imaging that offer superior performance in low-light applications. If you’re at Vision, we’re demoing the MityCAM-B2521F in a low-light setup, showing its imaging capabilities at various illumination levels: 1.0 lux, 0.1 lux, and 0.01 lux. For reference, 0.1 lux is less illumination than would be present under a full moon on a clear night; 0.01 lux is getting closer to a moonless night sky. (Almost, if not quite, Schrödinger’s-cat-style “no information”.)

 

——————————————————————————-

Copyright credit for the photo goes to Patricia Enigl, IQOQI

What with Halloween soon upon us and all…

What with Halloween soon upon us and all, I got to thinking about coming up with a topic that’s related to both Trick or Treat and embedded electronics. I’m sure that folks can think of lots of different connections – other than the obvious one of dusting off an old slide rule, finding a pocket protector, and putting some adhesive tape across the bridge of your glasses and going as an engineer from back in the day (i.e., back before people realized how cool our profession is!). But the one I came up with is the bat deaths associated with wind farms.

Believe it or not, this is a big deal – lots of bats (and birds) are being killed by flying into wind turbines, and changing air pressure in and around wind farms can also have a really negative impact on bats. (Their lungs can explode.) It’s an issue that’s getting more attention as we look for energy sources that are alternatives to fossil fuels.

Last winter, there was an interesting article by Roger Drouin on Grist that listed “Eight Ways Wind Power Companies Are Trying to Prevent Deadly Collisions”.

Some of the ways don’t have a lot to do with Critical Link’s line of work – like finding places to locate wind turbines that aren’t on bat and bird migratory paths, or painting the blades a color that doesn’t attract the sort of insects bats eat. But other ideas are the kind of applications we get involved in: using radar to detect when bat colonies are approaching and slowing the blades down in response, and strike detection that would also slow or shut things down once a turbine had been hit.
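In control terms, both of those ideas reduce to a simple supervisory rule. A hypothetical sketch – the signal names and thresholds are mine, not from any actual wind-farm system:

```python
# Hypothetical supervisory logic for bat-aware turbine curtailment.
# Illustrative only: signal names and thresholds are made up.
def rotor_setpoint_rpm(bat_activity: float, strike_detected: bool,
                       normal_rpm: float = 15.0) -> float:
    """Return a rotor speed setpoint given detector inputs (0 <= bat_activity <= 1)."""
    if strike_detected:
        return 0.0                 # shut down after a detected strike
    if bat_activity > 0.5:
        return normal_rpm / 3.0    # slow the blades while bats are nearby
    return normal_rpm

print(rotor_setpoint_rpm(bat_activity=0.8, strike_detected=False))  # 5.0
print(rotor_setpoint_rpm(bat_activity=0.1, strike_detected=True))   # 0.0
```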

We tend to think about the positive environmental impacts of alternative energy, and not about some of the problems that go along with it.

Anyway, something to think about and, since it is about bats, this post is at least vaguely related to Halloween.

So Happy Halloween to everyone.

My kids are beyond the age where they go Trick or Treating, so no goody bags to raid in my house – other than the bowl of candy we have ready for our guests. I’ll have to see what’s around the Critical Link kitchen. (There is always something.)

The Human Brain Project

A couple of weeks back, I saw an article on a very interesting topic over on EE Times.

Like a Manhattan Project, resources are coming together for the big push to simulate the human brain. Personnel on European Union (EU)’s Human Brain Project reported their progress toward the primary directive — an artificial brain by 2023 — at the annual HBP Summit at the University of Heidelberg in Germany, yesterday, September 29.

The 10-year-long Human Brain Project, funded to the tune of €1 billion (US$1.3 billion) by the European Commission Future and Emerging Technologies as one of its “Flagship Programs,” aims to simulate the entire human brain on supercomputers first, then build a special hardware emulator that will reproduce its functions so accurately that diseases and their cures can be tried out on it. Ultimately, the long-term goal is to build artificial brains that are inexpensive enough to outperform traditional von Neumann supercomputers at a fraction of the cost. (Source: EE Times)

In the words of the Human Brain Project (HBP) itself – that is, if the Human Brain Project itself actually had words, rather than using the words of actual humans:

Understanding the human brain is one of the greatest challenges facing 21st century science. If we can rise to it, we can gain profound insights into what makes us human, build revolutionary computing technologies and develop new treatments for brain disorders. Today, for the first time, modern ICT has brought these goals within reach. (Source: Human Brain Project)

According to the EE Times article, HBP reports that it’s making good initial progress in terms of pulling together researchers and setting up the systems that will enable those researchers to collaborate. Some starter projects have also begun. I would like to point out that, for now at least, all those researchers are human, as are the folks responsible for creating and deploying all the technology that will be used in the project. Not to mention any bricks and mortar, web site development, etc. So humans haven’t been replaced quite yet!

The next phase of the project will be compiling data on brain functioning and developing “the necessary infrastructure for developing six ICT platforms.” These platforms, which will be served by supercomputers, will include a data depository, a brain simulation platform “where simulation algorithms of the various brain components will be assembled”, and a platform “that mimics the various functions of the brain.”

Reading through the comments on the article, you can see that people are intrigued, excited, and maybe a bit worried that, when a computer-based human brain is created, there won’t be much left for us to do. On the excited side, there’s the happy expectation that, once relieved of tasks that computers can handle, we’ll be freed up to enjoy more leisure, spend our time more creatively, tend more to our emotional lives…

I guess you can say that I’m a bit of all of the above.

There’s science, engineering, and technology at play. So of course I’m interested.

But I also like my work, which is challenging and creative.

We’ll see what happens as 2023 nears and the project is closer to completion.

In the meantime, it all reminds me of The Worst Jobs in History, a British show that focused on some really terrible jobs that the industrial revolution pretty much did away with. The one that sticks in my mind is that of fuller. A fuller’s job was to stand in vats of urine all day, stomping on wool cloth to get rid of its dirt and grease.

Something to think about when you’ve had a bad day at work…