Hayabusa is the Japanese word for the peregrine falcon, a magnificent bird of prey with many distinct natural advantages. Its eyesight exceeds that of humans and is among the fastest in the animal kingdom: while humans register still images at roughly 25 frames per second, the peregrine falcon can recognize close to 130. Like most falcons, it has two foveae in each eye, whereas humans have only one, which gives it superior visual acuity. Beyond greater clarity and recognition, this eyesight allows the falcon to hunt both day and night. For all of these reasons, we named our image sensor platform after its natural counterpart: the Hayabusa image sensor platform.
Our team wanted to connect this magnificent animal and its sensing capabilities to the innovative sensor platform they had developed. The Hayabusa is not only the animal with the highest visual recognition but also the fastest on the planet, one that thrives in fierce competition. In the same spirit, the Hayabusa image sensor platform comprises a series of devices, differing by megapixel count, that together deliver a cost-effective product with optimal performance in any environment.
To learn more about this innovative image sensor platform, we spoke with Radhika Arora, senior director of product marketing, Automotive Sensing Division, who explains what the platform is capable of and the design challenges along the way.
Q: What is the Hayabusa Image Sensor Platform?
A: The Hayabusa image sensor platform is a family of products that encompasses three sensors, the AR0147AT, AR0233AT and AR0323AT, ranging from 1.3 to 3.1 megapixels, and three one-megapixel systems on a chip (SoCs), the AS0147AT, AS0148AT and AS0149AT. This platform lets us offer a scalable approach for customers with varying needs. The advantage of taking products from the same platform is reuse: customers can share all the driver development, tuning and register settings across these products because the pixel performance remains identical throughout the platform.
Q: Is the Hayabusa Image Sensor Platform native to the automotive segment, or does it have use-cases in other applications?
A: The origin of the Hayabusa technology is rooted in our high-end cinematography portfolio of sensors. onsemi has worked with ARRI over the past two decades on high-quality digital cameras used in Hollywood films and at the Oscars. In addition to cinematography, we have been able to adapt the Hayabusa technology for automotive and industrial cameras, among many other applications. We also see a trend toward using the Hayabusa platform as a vision sensor for robotics. Factory floors continue to move toward robotic automation, and lighting conditions there vary greatly, so adding high dynamic range (HDR) capability to robots proves beneficial. So to answer your question, yes, there are more extensive use cases for the Hayabusa platform outside of passenger vehicles.
Q: What types of systems will utilize the Hayabusa platform?
A: There are multiple uses for the Hayabusa platform. Viewing is definitely an important one: we can utilize the Hayabusa in surround-view cameras, rear-view cameras, camera monitoring systems and digital video recorders (DVRs). Additionally, we see an upswing in advanced driver-assistance systems (ADAS) and autonomous driving integrating this platform. These products appear in both high-end and low-end applications, attesting to the platform's flexibility.
Q: During the development phase, what types of challenges presented themselves?
A: Before we jump into the challenges of developing this platform, we should step back and consider the challenges our customers face. As a leader in automotive imaging, with onsemi owning over 40% of the market overall and over 70% of the ADAS market, we faced three key challenges.
The first of these challenges was dynamic range, the ratio between the largest and smallest values a quantity can assume. Much of the scenery a sensor must register often exceeds 120 dB. As human drivers, we have all faced situations like exiting a tunnel or cresting an overpass, when a sudden flood of light hits our faces and blinds us. That sudden blindness can cause accidents and fatalities, and it motivated us to develop sensors that overcome these dynamic changes in lighting. The challenge was to create a product that can clearly capture high dynamic range conditions and identify the objects in the scene for automotive applications. Accurately identifying the scene under changing conditions lets the platform assist the driver in making appropriate decisions.
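To make the dynamic range figure concrete, here is a minimal sketch of how a ratio between the brightest and darkest measurable signal translates into decibels; the specific signal values are illustrative, not from the sensor's datasheet.

```python
import math

def dynamic_range_db(max_signal, min_signal):
    """Dynamic range in decibels: 20 * log10(max / min)."""
    return 20 * math.log10(max_signal / min_signal)

# A 1,000,000:1 ratio between full sunlight and deep shadow
# corresponds to the 120 dB figure quoted above.
print(dynamic_range_db(1_000_000, 1))  # 120.0
```

Every additional factor of ten in the brightness ratio adds another 20 dB, which is why tunnel exits and oncoming headlights push scenes past what a conventional sensor can capture in a single exposure.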
The second challenge was to overcome lighting flicker via LED flicker mitigation (LFM). Traffic signals appear to emit light constantly, but LEDs actually pulse at a rate the human eye cannot register; an image sensor, however, can see and catalog these pulses. If the sensor captures a traffic signal during its off-cycle, it could wrongly interpret the indicator. The Hayabusa platform allows us to capture LED signs and signals accurately. What differentiates it from others on the market is the ability to capture high dynamic range and mitigate LED flicker at the same time.
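The timing problem described above can be sketched in a few lines. This is a simplified model, not the platform's actual LFM algorithm: it assumes an LED that is on for the first fraction of each cycle and checks whether an exposure window overlaps any on-pulse. The period and duty-cycle values are illustrative.

```python
def led_visible(exposure_start_ms, exposure_ms, led_period_ms=11.1, duty=0.1):
    """Return True if the exposure window overlaps any LED 'on' pulse.

    Simplification: the LED is on for the first duty * period of each cycle.
    """
    on_ms = led_period_ms * duty
    end = exposure_start_ms + exposure_ms
    cycle = int(exposure_start_ms // led_period_ms)
    # Walk cycle by cycle and test for overlap with each on-pulse.
    while cycle * led_period_ms < end:
        pulse_start = cycle * led_period_ms
        pulse_end = pulse_start + on_ms
        if exposure_start_ms < pulse_end and end > pulse_start:
            return True
        cycle += 1
    return False

# A long exposure always spans at least one pulse; a short one can miss them all.
print(led_visible(0.0, 12.0))  # True: exposure covers a full LED cycle
print(led_visible(2.0, 0.5))   # False: short exposure lands in the off-time
```

The sketch shows why naive short exposures, which are exactly what bright daytime scenes demand, are the ones most likely to catch an LED in its off-cycle, and why combining HDR with flicker mitigation is the hard part.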
The third obstacle before us was the ability to detect objects in low-light situations. For example, suppose a driver at night confronts an approaching vehicle with its lights on full beam while, at the same time, a pedestrian is crossing the street. The approaching vehicle's bright lights can obscure the pedestrian, and it would be entirely conceivable for a human driver to miss the pedestrian in this situation. Because the image sensor can be exposed for an extended time without the risk of saturating against bright objects, we can increase low-light performance enough to capture such objects, in this example the pedestrian. As you can see, this is a significant safety feature and an important selling point.
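One common way to get a long, low-noise exposure without clipping bright objects is to merge two exposures of different lengths per pixel. The sketch below illustrates that general idea only; it is not the Hayabusa pixel's actual readout scheme, and the 12-bit full scale and 16:1 exposure ratio are assumed values.

```python
def hdr_pixel(long_reading, short_reading, exposure_ratio=16, full_scale=4095):
    """Merge two exposures of one pixel into a single linear HDR value.

    If the long exposure clipped, fall back to the short exposure scaled by
    the exposure ratio; otherwise keep the cleaner long-exposure reading.
    """
    if long_reading >= full_scale:      # long exposure saturated
        return short_reading * exposure_ratio
    return long_reading

# Bright headlight: the long exposure clips, the short one recovers the value.
print(hdr_pixel(4095, 3000))  # 48000
# Dim pedestrian: the long exposure is unsaturated and keeps its low-noise value.
print(hdr_pixel(800, 50))     # 800
```

The long exposure preserves detail in the dark parts of the scene (the pedestrian), while the short exposure keeps the headlights from saturating, which is the trade-off the answer above describes.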
Q: Earlier, you mentioned how light can affect the Hayabusa platform, and you discussed the challenges the team needed to overcome with regard to those environmental changes. Along the same lines, can factors like weather or temperature adversely affect the performance of this platform?
A: That is true for these and for all image sensors. As temperature increases, dark current rises, and dark signal non-uniformity (DSNU) is one measure of how well a sensor performs in low-light situations. Dark current roughly doubles for every eight degrees Celsius the temperature rises, so you want to start with the lowest possible dark current, because the sensor perceives this dark signal as additional "noise." That noise can lead to a false positive or a false negative in what the image sensor recognizes. At the pixel level, the Hayabusa platform lowers the noise floor, increasing its reliability and accuracy, reducing the overall dark current and improving low-light recognition.
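The doubling rule quoted above can be written as a simple exponential. The reference temperature and starting current below are illustrative assumptions, not onsemi specifications; only the eight-degree doubling interval comes from the answer itself.

```python
def dark_current(i_ref, temp_c, temp_ref_c=25.0, doubling_c=8.0):
    """Dark current doubles for roughly every `doubling_c` degrees Celsius
    above the reference temperature (the rule of thumb cited above)."""
    return i_ref * 2 ** ((temp_c - temp_ref_c) / doubling_c)

# Starting from 1 unit of dark current at 25 degrees C:
print(dark_current(1.0, 33.0))  # 2.0  (8 degrees hotter -> doubled)
print(dark_current(1.0, 49.0))  # 8.0  (24 degrees hotter -> three doublings)
```

The exponential growth is why a small reduction in the starting dark current pays off disproportionately inside a hot car cabin or engine bay.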
Q: What safety measures are in place to prevent the image sensors from failing and potentially causing a risk to the passengers and vehicle?
A: The Hayabusa platform supports ASIL-B, and the products comply with ISO 26262, the automotive functional safety standard. With these sensors, we can detect faults that could be critical to system evaluation and reaction. On-board, real-time safety mechanisms let the system appraise every frame it captures, leading to a safer design and greater controllability for our customers.
Q: What makes the Hayabusa platform the “go-to” product?
A: Hayabusa not only has the best-in-class high dynamic range on the market but also superior low-light performance. It provides both dynamic range and flicker mitigation simultaneously in a single product, which is critical because competing products usually offer one or the other. The Hayabusa platform has automotive-grade reliability (AEC-Q100 Grade 2) and the safety mechanisms discussed earlier. From a reliability and performance standpoint, it surpasses anything at this resolution.
onsemi holds a very strategic position in today's ecosystem. The Hayabusa platform is a complex system consisting of a lens, a module and an interface from the module, which requires us to work with other companies for the serializer. Another aspect is our ability to tune the image sensor against the various processors on the market to ensure customers get the right image processing and final image quality. All of these items are part of an ecosystem that must be in place and mature enough for a customer to integrate the image sensor into their design. This is where we excel: a mature ecosystem spanning the lens, the module, the interface and the processor. onsemi leads the industry in these critical items, helping reduce the time customers need to integrate the image sensor into their system.
Visit our website for more information about our image sensors and ADAS solutions.