More stories

  • Fly brains can detect threatening drones

    Striped hover fly.
    Peakpx
    Bio-inspired design has been a hallmark of technological advancement, and that’s still true in the age of flying robots. The latest proof comes out of Australia, where researchers have mapped the visual systems of hovering insects as a means of detecting the acoustic signatures of drones up to 2.5 miles away.

    Anthony Finn, Professor of Autonomous Systems at the University of South Australia, says that insect vision systems have been mapped for some time now to improve camera-based detection. But applying the same method to acoustic data represents a major innovation.

    “Bio-vision processing has been shown to greatly increase the detection range of drones in both visual and infrared data. However, we have now shown we can pick up clear and crisp acoustic signatures of drones, including very small and quiet ones, using an algorithm based on the hover fly’s visual system,” Finn says.

    The potential applications of the research, of course, include military and defense uses. In addition to the University of South Australia and Flinders University, defense company Midspar Systems participated in trials using bio-inspired signal processing techniques. Such techniques, according to the researchers, show up to a 50% better detection rate than existing methods.

    The hover fly, which can hover above plants to collect nectar, was chosen because of its superior visual and tracking skills. Dimly lit scenes are visually very noisy, but insects such as the hover fly can process and capture visual signals with remarkable effectiveness. Mapping this same processing technique to acoustic detection resulted in a substantial increase in detection capability, including in noisy environments.

    “Unauthorised drones pose distinctive threats to airports, individuals, and military bases,” says Finn. “It is therefore becoming ever-more critical for us to be able to detect specific locations of drones at long distances, using techniques that can pick up even the weakest signals. Our trials using the hoverfly-based algorithms show we can now do this.”

    The researchers looked for both patterns (narrowband) and general signals (broadband) to pick up drone acoustics at short to medium distances. The new bio-inspired processing technique improved detection ranges by between 30% and 49%. The findings have been reported in The Journal of the Acoustical Society of America.
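The researchers' exact algorithm isn't reproduced here, but the general idea of borrowing insect early-vision stages for acoustic data can be sketched. The toy example below is an illustration, not the published method: all parameters and the synthetic 1 kHz "drone" tone are assumptions. It applies two hoverfly-inspired stages to a spectrogram, per-frame adaptive gain control followed by lateral inhibition across frequency, which suppresses broadband noise relative to a narrowband tone:

```python
import numpy as np

def spectrogram(x, win=256, hop=128):
    """Magnitude spectrogram via a Hann-windowed short-time FFT."""
    w = np.hanning(win)
    frames = [x[i:i + win] * w for i in range(0, len(x) - win, hop)]
    return np.abs(np.fft.rfft(frames, axis=1))  # shape: (frames, bins)

def bio_inspired_enhance(S, eps=1e-9):
    """Two stages loosely modeled on insect early vision:
    1) adaptive gain control: each time frame is normalized by its own
       mean level (akin to photoreceptor light adaptation);
    2) lateral inhibition: each frequency bin is suppressed by its
       neighbors, sharpening narrowband peaks against broadband noise.
    (np.roll wraps at the band edges; acceptable for a sketch.)"""
    adapted = S / (S.mean(axis=1, keepdims=True) + eps)
    surround = (np.roll(adapted, 2, axis=1) + np.roll(adapted, -2, axis=1)) / 2
    return np.clip(adapted - surround, 0.0, None)

# Synthetic drone-like tone at 1 kHz buried in heavy noise (8 kHz rate).
rng = np.random.default_rng(0)
fs, f0 = 8000, 1000
t = np.arange(fs) / fs
x = 0.3 * np.sin(2 * np.pi * f0 * t) + rng.normal(0.0, 1.0, fs)

S = spectrogram(x)
E = bio_inspired_enhance(S)
tone_bin = round(f0 / fs * 256)  # FFT bin containing the tone
contrast_raw = S[:, tone_bin].mean() / S.mean()
contrast_enh = E[:, tone_bin].mean() / (E.mean() + 1e-9)
```

On this synthetic signal the tone's contrast against the background improves markedly after enhancement. The published hoverfly model is considerably more sophisticated, but the suppress-the-surround principle is the same.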

  • Supply chain woes? Say hi to the world's smartest forklift

    OTTO Lifter smart autonomous forklift.
    OTTO Motors
    A new robot forklift with some serious smarts is debuting at MODEX, the largest supply chain conference in the Americas. The conference is getting a lot more attention than usual amid ongoing global logistics pressures, and so is the case for a major automation overhaul in the logistics sector.

    Innovation

    Into that fray enters OTTO Motors, one of a growing number of robotics firms specializing in materials handling for a severely strained global logistics paradigm. OTTO’s newest robot forklift, named Lifter, is billed as the smartest forklift on the market and adds to the company’s lineup of autonomous mobile robots (AMRs) that, taken together, can do much of the materials handling work of a major shipping and receiving warehouse. That’s a fairly astounding development given how little automation penetration there was in the sector until the 2010s.

    “Over the last decade, our AMRs have solved material handling challenges for some of the largest companies in the world, but our customers have made it clear that there was a missing member of the team — one that could pick up a pallet on its own,” said Matt Rendall, CEO and founder of OTTO Motors. “We’ve answered that call with OTTO Lifter — your new forklift driver. We put years of autonomous driving experience into OTTO Lifter, making it the smartest forklift on the market.”

    What makes the new generation of AMRs like Lifter so useful is their ability to navigate crowded, dynamic, semi-structured environments, a feat made possible by a host of sensing technologies that have rapidly fallen in price over the past few years, combined with intelligence such as dynamic path planning, lane tending, and intelligent pallet detection.

    According to the company, all that adds up to a safer forklift, which is no small thing. Traditional forklifts account for 10% of all physical injuries in workplaces where they’re used: an average of 85 deaths annually and nearly 35,000 serious injuries in the U.S. alone, according to OSHA statistics. The National Safety Council attributes 70% of all industrial accidents to operator error, with fatigue a factor in 69% of those cases.
Remarkably, OTTO boasts three million hours of material handling driving without a single safety incident.

Of course, if safety is a major concern, it’s probably trumped in the market by financial incentives. OTTO estimates that its forklift costs about $9 per working hour, an important figure because it undercuts the fully loaded cost of a human operator in many markets. OTTO, like most automation vendors, emphasizes that its robots are meant to work alongside people and augment human workforces — and that’s substantially true of most AMRs on the market today — but it’s a sure bet that operations managers will weigh that $9 per hour against the projected cost to run a manned forklift.

All of this paints a pretty clear picture of how automation is beginning to tip the balance in a variety of industries, from logistics and manufacturing to spaces like construction. The blended era of humans and autonomous machines collaborating in the same spaces is very much here, and the next few years will see AMRs crop up in more generally visible settings, including hospitals and restaurants.

For now, if you want a pallet lifted, there’s a smart forklift ready and willing to work for cheap.
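For a rough sense of why that figure matters, here is the arithmetic an operations manager might run. Only the $9-per-hour robot estimate comes from OTTO; the wage, overhead multiplier, and annual hours below are illustrative assumptions:

```python
ROBOT_COST_PER_HOUR = 9.00   # OTTO's stated estimate, USD/hour

# Everything below is an illustrative assumption, not an OTTO figure.
operator_wage = 18.00        # assumed base wage, USD/hour
payroll_overhead = 1.25      # assumed 25% uplift for benefits/taxes
annual_hours = 2 * 2000      # assumed two shifts of coverage per year

manned_cost = operator_wage * payroll_overhead * annual_hours   # 90,000
robot_cost = ROBOT_COST_PER_HOUR * annual_hours                 # 36,000
annual_savings = manned_cost - robot_cost                       # 54,000
```

Under these assumptions a single robot saves roughly $54,000 per vehicle per year, the kind of number that drives adoption regardless of the safety case.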

  • Chipotle is testing a new tortilla chip robot (no, really!)

    Chipotle
    We love our robots, and the quirkier the better. It doesn’t get much more smile-inducing than a new model from food service robotics pioneer Miso Robotics, designed to cook and freshly season tortilla chips to order.

    Miso is a growing tech company to watch, an early leader in the push to automate fast food, at least when it comes to the actual cooking. Miso’s burger- and chicken-wing-preparing robots (Flippy is the best known) tend a griddle just like a human chef, making them easy to integrate into existing kitchens, and they have scored big votes of confidence from national chains like White Castle and Buffalo Wild Wings.

    Chipotle is the latest brand to dip a toe into automation. The chain is partnering with Miso on a robot named Chippy, an autonomous kitchen assistant that integrates culinary traditions with artificial intelligence to make tortilla chips. “We are always exploring opportunities to enhance our employee and guest experience. Our goal is to drive efficiencies through collaborative robotics that will enable Chipotle’s crew members to focus on other tasks in the restaurant,” said Curt Garner, Chief Technology Officer, Chipotle.

    One of the big draws for national brands at this early adoption stage is Miso’s strategy of customization. Chipotle’s culinary team guided Miso in tailoring its technology to replicate Chipotle’s exact recipe – using corn masa flour, water, and sunflower oil – to cook chips that are indistinguishable from their human-made counterparts. Chipotle’s chips are finished with a dusting of seasoning and a hint of fresh lime juice.

    “Everyone loves finding a chip with a little more salt or an extra hint of lime,” said Nevielle Panthaky, Vice President of Culinary, Chipotle.
“To ensure we didn’t lose the humanity behind our culinary experience, we trained Chippy extensively to ensure the output mirrored our current product, delivering some subtle variations in flavor that our guests expect.”

That’s an interesting window into one of the pitfalls (and possible opportunities) of automation. Much like the unplanned artifacts and saturated colors of vintage film, something can be lost in the pursuit of technologically abetted perfection. Miso’s robot, then, was trained to embrace some measure of inconsistency.

Chippy is currently being tested at the Chipotle Cultivate Center, Chipotle’s innovation hub in Irvine, Calif., and will be integrated into a Chipotle restaurant in Southern California later this year.

  • How MIT's robot Cheetah got its speed

    MIT
    There’s a new version of a very quick quadrupedal robot from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL). While four-legged robots have garnered no end of attention over the last couple of years, one surprisingly quotidian skill has been elusive for them: running.That’s because running in a real-world environment is phenomenally complex. The quick pace leaves scant room for robots to encounter, recover from, and adapt to challenges (e.g., slippery surfaces, physical obstacles, or uneven terrain). What’s more, the stresses of running push hardware to its torque and stress limits. MIT CSAIL PhD student Gabriel Margolis and Institute of AI and Fundamental Interactions (IAIFI) postdoctoral fellow Ge Yang recently told MIT News:

    In such conditions, the robot dynamics are hard to analytically model. The robot needs to respond quickly to changes in the environment, such as the moment it encounters ice while running on grass. If the robot is walking, it is moving slowly and the presence of snow is not typically an issue. Imagine if you were walking slowly, but carefully: you can traverse almost any terrain. Today’s robots face an analogous problem. The problem is that moving on all terrains as if you were walking on ice is very inefficient, but is common among today’s robots. Humans run fast on grass and slow down on ice – we adapt. Giving robots a similar capability to adapt requires quick identification of terrain changes and quickly adapting to prevent the robot from falling over. In summary, because it’s impractical to build analytical (human designed) models of all possible terrains in advance, and the robot’s dynamics become more complex at high velocities, high-speed running is more challenging than walking.

    What separates the latest MIT Mini Cheetah is how it copes. Previously, the MIT Cheetah 3 and Mini Cheetah used agile running controllers designed by human engineers who analyzed the physics of locomotion, formulated efficient abstractions, and implemented a specialized hierarchy of controllers to make the robot balance and run. That’s the same way Boston Dynamics’ Spot robot operates.

    This new system instead learns from experience in real time. By training its simple neural network in a simulator, the MIT robot can acquire 100 days’ worth of experience on diverse terrains in just three hours.

    “We developed an approach by which the robot’s behavior improves from simulated experience, and our approach critically also enables successful deployment of those learned behaviors in the real world,” explain Margolis and Yang. “The intuition behind why the robot’s running skills work well in the real world is: Of all the environments it sees in this simulator, some will teach the robot skills that are useful in the real world. When operating in the real world, our controller identifies and executes the relevant skills in real time,” they added.

    Of course, like any good academic research endeavor, the Mini Cheetah is more proof of concept than end product, and the point here is how efficiently a robot can be made to cope with the real world. Margolis and Yang point out that paradigms of robotics development and deployment that require human oversight and input for efficient operation are not scalable. Put simply, manual programming is labor intensive, and we’re reaching a point where simulations and neural networks can do an astoundingly faster job. The hardware and sensors of the previous decades are now beginning to live up to their full potential, and that heralds a new day when robots will walk among us.

    In fact, they might even run.
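The headline training numbers imply a striking throughput, which quick arithmetic makes concrete (the parallel-environment figures below are illustrative assumptions, not details from the MIT work):

```python
# 100 simulated days of experience gathered in 3 wall-clock hours.
sim_days = 100
wall_clock_hours = 3
realtime_factor = sim_days * 24 / wall_clock_hours  # 800x real time

# One plausible way to get there (assumed, not MIT's actual setup):
# run many simulated robots in parallel, each faster than real time.
speed_per_env = 10.0            # assumed: each env runs 10x real time
envs_needed = realtime_factor / speed_per_env       # 80 parallel envs
```

An aggregate 800x real-time factor (for example, 80 environments each running 10x faster than reality) is routine for modern physics simulators, which is exactly why simulation-first training scales where hand-tuned controllers don't.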

  • Drone delivery nearer to take-off following latest FAA recommendations

    By Stanisic Vladimir — Shutterstock
    No, seriously: drone delivery is coming. If you were skeptical before, an FAA committee just laid a huge piece of the compliance groundwork to make it a reality.

    The FAA’s Beyond Visual Line of Sight Aviation Rulemaking Committee (BVLOS ARC) published its final report last week. The committee is charged with paving the way toward broader commercial use of drones in the U.S., and its findings are being widely applauded by those in the sector who have sought a broader scope for commercial drone operations, including in applications like search and rescue and delivery.

    “Around the world, commercial drones are saving lives, making jobs more efficient, inspecting infrastructure at scale, and growing the economy,” said Lisa Ellman, Executive Director of the Commercial Drone Alliance, an industry trade group. “But here in the U.S., existing regulations hold back the drone industry by unnecessarily applying incongruous standards and approaches designed for crewed aircraft. This ARC report outlines a common-sense, risk-based, performance-based approach that balances safety with innovation, and will enable drone-based operations to scale in the U.S. for the benefit of all Americans.”

    Industry advocates have argued that unlocking the BVLOS marketplace will advance progress across a number of areas, including sustainable transportation, carbon emission reduction, equitable access to medicines and vaccines, safer and more effective critical infrastructure inspection, emergency response, aerospace jobs, and domestic manufacturing.

    The chorus on the other end of the spectrum hasn’t been all that loud, perhaps a function of the relatively obscure rulemaking processes at work, to which the industry is paying close attention but average consumers may not be.

    A common industry argument is that the U.S. has lagged behind Europe in integrating drones into the National Airspace, in large part due to the limitations of the regulatory framework and the federal bureaucracy’s struggle to move nimbly. The recent FAA report gives the clearest indication yet of what a coming BVLOS regulatory framework will look like. The committee made recommendations on matters like pilot training requirements, right of way, and rules for third-party providers, such as commercial delivery vendors.
Groups like the Commercial Drone Alliance, a non-profit led by leaders in the commercial drone and advanced air mobility industries, have long advocated for such recommendations, an interesting case of industry leaders feeling hamstrung by a lack of government guidance. In January, Congress issued a directive to the FAA to finalize and disclose its BVLOS plans within 90 days, prioritizing rulemaking around the issue.

  • Giant 180-ton robot trucks are mining gold

    SafeAI
    A mining outfit in Australia is making a big bet on big robots. Following a recent proof of concept at a gold mine, mining contractor MACA will retrofit a fleet of 100 very large vehicles, creating one of the largest autonomous heavy equipment fleets in the world.

    This is a pretty significant rollout and a proverbial canary in the gold mine for the sector’s broader automation ambitions. With the world hungrier than ever for precious and rare earth metals, technology is increasingly called on to make mining operations more efficient and cost-effective while unlocking increasingly scarce resources.

    Powering the new rollout is autonomous heavy equipment company SafeAI and its Australian partner, Position Partners. This new generation of autonomous heavy vehicle technology is a major upgrade over first-generation retrofits, which had limited onboard processing power and, in most cases, took a long time to show ROI. Early versions of autonomous vehicle technology in the sector also operated as closed legacy systems, preventing mixed fleets from communicating. Industries like mining have had this technology for 20 years now, but its lack of accessibility means it hasn’t really taken off.

    Autonomy 2.0 is changing that. AI-powered and armed with multimodal sensors (lidar, radar, camera), these new systems have significant onboard processing power to reduce network reliance and enable fast decisions. They are also open, interoperable, and vehicle-agnostic, meaning technology like SafeAI’s retrofit autonomy can be applied to nearly any vehicle, of any age, from any manufacturer.

    “This technology is a game changer for our business, our customers and our industry,” explains Shane Clark, MACA’s General Manager of Estimating and Technical Services. “SafeAI’s versatile, scalable solution is unmatched in our industry right now, and has profound implications for site safety, efficiency and cost-effectiveness. We expect to see quick takeup from our customers as they begin to see the tremendous impact of this technology.”

    That scalability, exemplified by this 100-truck agreement, will accelerate the rollout of autonomous equipment in industries that are ready for it. One big benefit of autonomy is that it creates far safer working conditions for on-site workers. Of course, it also means that extractive industries like mining are becoming increasingly efficient.
As personal computing and battery technologies increasingly drive demand for mined resources, automation technologies are propelling those industries to ever greater capabilities, a cycle that warrants increasing vigilance.

  • Digital radar on a chip speeds autonomous vehicle adoption

    Uhnder radar-on-a-chip
    Uhnder
    A new radar-on-a-chip is on the way this year, ready for mass production and capable of 4D digital imaging, heralding a new chapter for autonomous vehicles and next-gen advanced driver assistance systems (ADAS). Uhnder, a firm based in Austin, Texas, expects its digital radar to be automotive-qualified in April 2022 and to debut on consumer production vehicles later this year.

    Radar was late to the game in autonomous vehicles, whose early developers preferred lidar and visual camera sensing suites. But radar has distinct advantages. For one, it’s a fantastic sensor for collision avoidance, particularly when used in concert with other sensing modalities. One disadvantage, however, has been its weight and form factor.

    Uhnder believes it has solved this with its radar-on-a-chip solution, a compelling example of how ADAS and autonomous vehicle development is helping drive evolution in sensing. “Uhnder’s 4D digital imaging radar-on-chip is a next-generation product that demonstrates new ways to advance automotive safety to save lives,” said Douglas Campbell, president of the Automotive Safety Council. “Fatalities of vulnerable road users are now 20 percent of all roadway deaths in the US and even more in developing countries. ADAS technologies, such as pedestrian automatic emergency braking (P-AEB) that can reliably operate at night, can help reduce pedestrian fatalities per the latest report from the Insurance Institute for Highway Safety. Improved high-resolution perception sensors, such as Uhnder’s radar-on-chip, can potentially help reduce this rising fatality category.”

    Other firms have pursued radar-on-a-chip development, including Imec, one of the big players in commercial radar. In 2020, Imec presented a chip that processes radar signals using a recurrent spiking neural network.
Its compact size made it easily deployable in drones and autonomous mobile robots, applications where size and weight are determining factors for sensing systems.

Uhnder is taking square aim at the vehicle space and claims its technology delivers the industry’s first digital radar solution, with better accuracy and the power to sense moving or standing objects, large or small, at both short and long distances, in all weather and lighting conditions, all while mitigating mutual interference with other radars, a big concern with analog radar.

“Digital radar provides 16 times better resolution, 24 times more power on target, and 30 times better contrast than today’s analog offerings, improving detection capabilities for better road safety for all users – drivers, passengers, cyclists, and pedestrians,” said Manju Hegde, CEO and cofounder of Uhnder, Inc. “As more and more radars are fitted onto vehicles and other mobility solutions, interference among adjacent radars becomes problematic. Our radar, based on Digital Code Modulation, mitigates this problem.”

The first tier 1 automotive customer for the 4D digital radar chip will be Magna, and Uhnder hopes to expand its customer base quickly. “I am excited with the progress made over the years and ready for the entire ecosystem to experience what we already know to be true – that digital radar is fundamental for next-generation ADAS,” said Swamy Kotagiri, CEO, Magna International.
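The interference-mitigation claim rests on code-modulated transmission. As a toy illustration (the code length, echo delay, and ±1 codes below are assumptions for the sketch, not Uhnder's actual Digital Code Modulation parameters), each radar correlates received samples against its own pseudorandom code, so its delayed echo produces a sharp peak while a neighboring radar's uncorrelated code only raises the noise floor:

```python
import numpy as np

rng = np.random.default_rng(1)
n, delay = 1024, 200                     # code length, echo delay (samples)
code_ours = rng.choice([-1.0, 1.0], n)   # our radar's pseudorandom code
code_other = rng.choice([-1.0, 1.0], n)  # a nearby radar's code

# Received signal: our own echo, delayed, plus the interferer's
# transmission at full strength (np.roll gives a circular delay,
# which is acceptable for a sketch).
rx = np.roll(code_ours, delay) + code_other

# Matched filter: correlate the received signal with our own code.
corr = np.correlate(rx, code_ours, mode="full")
peak_lag = int(corr.argmax()) - (n - 1)  # lag of the strongest return
```

The matched filter recovers the echo delay (`peak_lag == delay`) even with an equal-power interferer present: the neighbor's code is uncorrelated with ours, so its energy spreads into a low pseudorandom floor instead of a competing peak. In broad strokes, that is the argument for code-based digital modulation over analog schemes.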

  • Robot fry cook gets job at 100 White Castle locations

    White Castle
    White Castle seems to be all-in on its latest employee, a robotic fry cook. Flippy 2, the fast-food robot by Miso Robotics, will now be whipping up burgers and other food in 100 standalone locations.

    The news is part of a larger shift toward automation underway in the quick-serve sector, driven partly by demand for contactless service and partly by a tight labor market and rising wages. Just this week, in a similar move, Jamba announced it was strengthening its collaboration with Blendid, which makes a juice robot.

    White Castle first trialed Miso’s original Flippy robot in a Chicago-area location in 2020. The burger chain, which bills itself as the country’s first hamburger fast-food chain (it was founded in 1921), then rolled out a version of Flippy, Robot-on-a-Rail (ROAR), to an additional 10 kitchens.

    “Artificial intelligence and automation have been an area White Castle has wanted to experiment with to optimize our operations and provide a better work environment for our team members,” said Lisa Ingram, CEO of White Castle, at the time. “We believe technology like Flippy ROAR can improve customer service and kitchen operation. This pilot is putting us on that path — and we couldn’t be more pleased to continue our work with Miso Robotics and pave the way for greater adoption of cutting-edge technology in the fast-food industry.”

    The sales pitch from Miso is that its robot can alleviate inefficiencies in the back of house while ensuring consistent quality. Given the scope of the rollout, White Castle clearly deems the ROI equation valid.

    “We could not be more grateful for the confidence White Castle has shown in us as we enter the next phase of our partnership,” said Mike Bell, CEO of Miso Robotics. “White Castle was the first large brand to embrace our technology, and we are thrilled that our Flippy pilot made such a positive impact on their operations that they want to integrate 100 more. We can’t wait to continue on this journey with such an outstanding partner.”

    Miso’s journey, which we’ve covered since the company came out of stealth, has been fun to watch. The company ran a non-traditional crowdfunding campaign and is primarily funded by individual investors. It boasts over 15,000 shareholders and a whopping $50M in crowdfunding to date. Its Series E round gives it a market valuation of $500 million.