Ovie’s ‘Smarterware’ smart food storage aims to help reduce food waste

May 22, 2018
Filed under Green


Around 40 percent of food in America goes to waste each year, costing the average family of four about $2,000 annually. Chicago startup Ovie has an answer to this problem: Smarterware. Ovie’s Smart Tags, which keep track of food items’ freshness, can be clipped onto food, placed on six-cup containers, or attached to bottles or take-out boxes. According to the company, the system essentially transforms any regular refrigerator into a smart fridge without the steep price tag, and it is crowdfunding on Kickstarter right now.

Ovie’s Smarterware aims to change how people eat by helping them keep track of their food’s freshness. Rings around the Smart Tags light up green, yellow or red to show whether food is still safe, about to spoil, or has gone bad. Using the technology is simple: you press the button on a Smart Tag and your food is tagged via Amazon Echo or the companion app (a simplified sketch of this kind of freshness logic appears at the end of this post).

Related: New refrigerator camera takes aim at food waste

The app helps users take full advantage of what is in their fridge, letting them see the items they have tagged or search for recipes that use those ingredients. It notifies users when a light ring turns yellow and offers recipe suggestions. Ovie also plans to send a personalized monthly recap so users can see how they have been doing, along with tips based on their consumption trends.

Ovie CEO and co-founder Ty Thompson said in a statement, “People don’t want to waste all of this food; it just happens. We’re busy, we invest time and resources to make a great meal, and then we end up throwing away a large amount of food simply because we forget about it. We wanted to help solve this problem by creating a product that would be simple to use and bring a more mindful approach to food storage.”

You can snag early bird discounts on Ovie’s Kickstarter, which ends June 21. The company plans to start shipping in early 2019.

+ Ovie

+ Ovie Smarterware Kickstarter

Images courtesy of Ovie
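To make the freshness tracking described above concrete, here is a minimal sketch in Python, assuming a tag simply compares an item’s age against an estimated shelf life. The item names, shelf-life values and the 75-percent warning threshold are illustrative assumptions, not Ovie’s actual implementation.

# Minimal, hypothetical sketch of Smart Tag freshness logic.
# Shelf-life values and the 75% warning threshold are assumptions for illustration,
# not Ovie's actual implementation.
from datetime import datetime, timedelta

# Assumed shelf lives (in days) for a few tagged foods
SHELF_LIFE_DAYS = {
    "cooked chicken": 4,
    "leftover pasta": 5,
    "milk": 7,
}

def ring_color(item, tagged_on, now):
    """Return the light-ring state for a tagged item: 'green', 'yellow' or 'red'."""
    shelf_life = timedelta(days=SHELF_LIFE_DAYS[item])
    age = now - tagged_on
    if age >= shelf_life:
        return "red"              # past its assumed shelf life
    if age >= shelf_life * 0.75:
        return "yellow"           # nearing spoilage: notify the user and suggest recipes
    return "green"                # still fresh

# Example: chicken tagged three days ago shows yellow under these assumptions
print(ring_color("cooked chicken", datetime(2018, 5, 19), datetime(2018, 5, 22)))

Under these assumptions, a tagged item starts green, turns yellow once roughly three quarters of its shelf life has passed (the point at which the app would notify the user and suggest recipes), and turns red once the shelf life is up.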


AEye reveals new, energy-efficient data type to help self-driving cars see

May 22, 2018
Filed under Eco, Green


AEye, a San Francisco Bay Area-based company that develops hardware, software and algorithms that serve as the “eyes and visual cortex” of autonomous vehicles, has announced a new data type that will help self-driving cars see better and, as a result, reduce energy costs. The data type combines pixels from digital 2D cameras with voxels from 3D LiDAR (Light Detection and Ranging) sensors. By joining these into a unified high-resolution sensor data type known as Dynamic Vixels, AEye has created a format through which autonomous cars can more effectively evaluate a situation using 2D visual algorithms, while also drawing on 3D and even 4D information about an object’s location, intensity and velocity (a simplified sketch of such a fused record appears at the end of this post). The result is a visual system that is faster and more accurate while using eight to ten times less energy.

AEye believes that this integration of data types is essential for improving the capability of autonomous vehicles while reducing energy costs through more efficient computing. “There is an ongoing argument about whether camera-based vision systems or LiDAR-based sensor systems are better,” said AEye founder and CEO Luis Dussan. “Our answer is that both are required; they complement each other and provide a more complete sensor array for artificial perception systems. We know from experience that when you fuse a camera and LiDAR mechanically at the sensor, the integration delivers data faster, more efficiently and more accurately than trying to register and align pixels and voxels in post-processing. The difference is an order of magnitude better performance.”

Related: Lyft launches self-driving cars on the Las Vegas strip

AEye’s iDAR (Intelligent Detection and Ranging) perception system is modeled on human visual perception, using biomimicry to better equip autonomous vehicles for the safest ride possible. For example, iDAR can assess which way a child is facing and estimate the probability that the child will walk into the street, preparing the vehicle to stop if the child does.

“There are three best practices we have adopted at AEye,” said AEye Chief of Staff Blair LaCorte. “First: never miss anything; second: not all objects are equal; and third: speed matters. Dynamic Vixels enable iDAR to acquire a target faster, assess a target more accurately and completely, and track a target more efficiently, at ranges of greater than 230m with 10% reflectivity.”

AEye plans to release the AE100 artificial perception system, its first iDAR-based product, this summer. It will be available to Tier 1 and original equipment manufacturer (OEM) companies that have autonomous vehicle initiatives.

+ AEye

Images via AEye
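For readers curious what such a fused camera/LiDAR record could look like, here is a minimal sketch in Python in the spirit of the Dynamic Vixels idea. The Pixel, Voxel and DynamicVixel classes, their field names and the fuse helper are hypothetical illustrations; AEye has not published the actual format, and a real system would calibrate and register the two sensors at acquisition time rather than pairing already-matched samples.

# Hypothetical sketch of a fused camera/LiDAR record in the spirit of a "Dynamic Vixel".
# All names and fields here are illustrative assumptions, not AEye's actual format.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class Pixel:
    u: int                      # image column from the 2D camera
    v: int                      # image row
    rgb: Tuple[int, int, int]   # color sample

@dataclass
class Voxel:
    x: float                    # position in meters from the LiDAR point cloud
    y: float
    z: float
    intensity: float            # return intensity
    velocity: float             # radial velocity estimate (the "4D" component)

@dataclass
class DynamicVixel:
    """One fused record: appearance from the camera plus geometry and motion from the LiDAR."""
    pixel: Pixel
    voxel: Voxel

def fuse(pixel, voxel):
    # A real system would register the sensors mechanically at acquisition time;
    # this toy helper simply pairs a pixel and voxel assumed to describe the same point.
    return DynamicVixel(pixel=pixel, voxel=voxel)

sample = fuse(Pixel(640, 360, (210, 40, 40)), Voxel(12.3, -0.5, 1.1, 0.72, -2.4))
print(sample.voxel.velocity)   # -> -2.4

The point of the sketch is simply that each fused record carries both appearance information from the camera and geometry plus motion information from the LiDAR, which is what would let a perception system evaluate a scene with 2D visual algorithms while still knowing an object’s 3D position and velocity.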

