More stories

  • Gas stations are losing (Here's a novel way they'll adapt)

    Abandoned gas station. (Image: Pixabay)
    Gas station visits have been steadily declining due to the rise of sustainable mobility options like electric and autonomous vehicles. At the same time, gas stations occupy valuable urban real estate and are set to derive an ever-increasing share of their revenue from convenience stores rather than fuel pumps. Can gas stations save themselves from redundancy by better utilizing their physical space and becoming ecommerce hubs?

    The rise of ecommerce has presented an opportunity for retailers to analyze consumer needs and tailor their offerings to maximize revenue. Stor.ai, a digital commerce solution for grocers, believes gas station owners can jump on this trend and move from a vehicle-centric model to a customer-centric one. Gas stations are already repurposing their store space to enable shoppers to receive deliveries at their convenience.

    I reached out to Mendel Gniwisch, CEO of Stor.ai, to discuss how fuel retailers can reinvent the customer journey and use digital tools to extend the customer relationship beyond occasional visits to the service station.

    GN: Let’s start with the basics. What is the Stor.ai concept, particularly its innovative utilization of space?

    Mendel Gniwisch: Stor.ai was founded to assist grocers with their digital transformation by combining digital customer engagement across all touchpoints into one platform. Previously, online grocery shopping had developed separately from the in-store experience, resulting in a fractured shopping experience characterized by disparate digital touchpoints. With most shoppers now combining in-store visits with some use of ecommerce, Stor.ai leverages the latest in AI and personalization technology to help grocers retain their unique brand loyalty while meeting evolving customer expectations, online and in-store.

    Most retailers sign long-term leases on their stores, but when they signed those contracts 10 or 20 years ago, few could have foreseen just how quickly ecommerce would skyrocket. Retailers are faced with a new reality of grocery stores serving both shoppers and pickers, and a growing demand from customers for hyper-efficiency, speed and personalization. Stor.ai’s end-to-end digital transformation solution helps retailers use the space at their disposal as efficiently as possible. With the grocery industry’s future set to be defined by a fusion of in-store and online shopping, in-store real estate needs to be maximized to help retailers meet customers’ ever-increasing expectations for quick, friction-free fulfillment.

    Our picking app creates efficiency for the retailer, especially now with increasing labor shortages, while our platform ensures convenience and personalization so that the customer benefits at every touchpoint.

    GN: What are some of the obstacles gas stations are confronting amid changing driver behavior and new mobility technologies?

    Mendel Gniwisch: Gas station visits have been steadily declining due to the rise in sustainable mobility options like electric and autonomous vehicles. Simultaneously, due to the pandemic, fewer people are driving into work every day.

    Fuel retailers are in a position where they need to rethink their strategies, build their capabilities, and transform their businesses to support these serious changes. Otherwise, the changing way in which fuel is consumed risks making gas stations redundant, which is especially threatening for their owners and franchisees given the value of the real estate they occupy. That’s why the gas stations of the future will be expected to offer an expanded range of flexible and needs-based shopping options in order to first survive and then thrive.

    GN: What does it mean to move from a vehicle-centric model to a customer-centric model? What’s the vision for gas stations under Stor.ai’s influence?

    Mendel Gniwisch: Moving from a vehicle-centric to a customer-centric model entails reinventing the customer journey by using digital tools to extend the customer relationship beyond occasional visits to the gas station. By focusing on addressing the needs of customers, gas station owners can offer value even when drivers don’t need to fill up on fuel. The goal is to create a seamless, engaging customer experience that goes beyond the traditional service station offering.

    The rise of ecommerce is affecting all retail verticals, but the need to adapt is especially pressing for gas stations given that their traditional offering is predicted to become increasingly less relevant. As fuel becomes less critical, gas stations are left with two principal assets: their location (often prime real estate in or adjacent to cities) and their small on-site convenience stores. The gas station of the future will invest in new digital functions and technology capabilities that fit consumer trends and streamline the shopping experience. Owners can expand pre-existing offerings, build a click-and-collect infrastructure, or even place a dark store on site.

    From delivery on the go to the frictionless customer experience, fuel retailers concerned about the decline in fuel demand can find growth opportunities elsewhere by increasing the operational efficiency of their real estate and refocusing their energies on convenience retail.

    GN: A lot of this is about empowering smaller retailers with tools that until recently were the exclusive domain of major brands. Where else are you seeing opportunities for mom-and-pops to redefine their role and operations?

    Mendel Gniwisch: Across the board, retailers want to retain control of how they use customer data rather than farming it out to third parties, which helps them tailor their products as closely as possible to customers’ needs and prioritize consumer-first commerce. Third-party providers are ideal as a short-term fix to manage an instant digital transition, but this comes at a cost: The provider might initially drive traffic and revenue, but very soon the customers could be the provider’s rather than your own.
Instead, retailers are looking for independent solutions like Stor.ai to help them build their own online presence, own their customer interactions and data, and offer shoppers a digital experience that reflects the unique characteristics of individual brands.

GN: What’s next for Stor.ai? Where does the company expect to have its biggest wins in 2022?

Mendel Gniwisch: Most customers now shop in a hybrid manner, meaning that they do some of their shopping through digital channels and some through traditional, pre-digital methods. Stor.ai is embracing this changing reality, and in 2022 we are devoting our efforts to helping retailers combine the most impactful features of both online and offline shopping and offer the best of both worlds. Retailers will increasingly be expected to bring the efficiency and seamlessness of online shopping to the brick-and-mortar store while also bringing the experiential highs of the unique in-store experience to the online realm.

I see the optimal model for the store of the future as a totally personalized experience, using the latest AI, AR and VR technologies to help customers enjoy the smells, sights and sounds that could at one point only be experienced in person.

  • Sam's Club betting its cleaning robots can do double duty

    Sam’s Club’s floor-scrubbing robots. (Image: Brain Corp)
    Sam’s Club will soon be asking robots to do double duty. The membership warehouse club is undergoing a national, chain-wide rollout of an inventory scanning feature that will be added to existing floor scrubbing robots.

    The move suggests an interesting new chapter for Walmart Inc., owner of Sam’s Club. One of the biggest robotics stories of the last few years came when Walmart killed a 500-store deployment of shelf-scanning robots developed by automation firm Bossa Nova, which marked the end of the technology’s highest-profile test case to date. In the wake of the cancelled contract, developers of inventory scanning robots scrambled to differentiate their technology and prove that the fate of one company’s contract meant little to the technology’s long-term prospects.

    The latest rollout by Sam’s Club, which marks a return to autonomous inventory scanning by a Walmart brand, supports that thesis.

    “Sam’s Club is hyper-focused on making sure our members have a seamless shopping experience, so any time-saving innovation we can implement is significant. By adding Inventory Scan to our current fleet of robotic scrubbers, we obtain critical inventory data that previously was time-consuming to obtain,” said Todd Garner, VP of In-Club Product Management at Sam’s Club. “This intelligence allows us to proactively manage our clubs in an efficient manner. Inventory Scan assures items are available and easy to locate in the club, freeing up time for our associates to focus on members and the shopping experience they deserve.”

    Also: Robotaxis get new learning strategies to face “the edge”

    This is a noteworthy deal for the robotics sector insofar as it’s a good illustration of what automation is going to look like “in the wild” in the coming years. Brain Corp, which has been quietly building an empire based around robotic scrubbing machines, isn’t glitzy by robotics development standards. However, the company’s AI-powered machines are massively popular amid ongoing labor shortages and pandemic-related shifts in how commercial spaces are utilized. While other companies are manufacturing standalone inventory scanning robots, Brain Corp has been building on its success over the past few years by diversifying the capabilities of its robots.

    The add-on scanning accessory will be fitted to the almost 600 autonomous floor scrubbers already deployed in Sam’s Club stores nationwide. These towers, powered by Brain Corp’s AI operating system, BrainOS, and manufactured by Tennant Company, will capture data as the robots move autonomously around the store. Reports are then delivered to Sam’s Club managers with insights such as verification of pricing accuracy, planogram compliance, product stock levels, and product location. Each function replaces a time-consuming manual process, reducing waste and inventory loss.
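Neither Sam’s Club nor Brain Corp has published the report format, so purely as an illustration, a nightly inventory report built from shelf-scan observations might be assembled along the following lines. Every field name and threshold below is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ShelfObservation:
    """One product facing captured as a scrubber drives an aisle (hypothetical schema)."""
    sku: str
    aisle: str
    expected_price: float    # price on file
    shelf_tag_price: float   # price read from the shelf label during the scan
    facings_expected: int    # facings called for by the planogram
    facings_seen: int        # facings actually detected by the scan

def build_report(observations):
    """Summarize pricing accuracy, planogram compliance, and stock levels for managers."""
    price_errors = [o.sku for o in observations
                    if abs(o.shelf_tag_price - o.expected_price) > 0.009]
    out_of_stock = [o.sku for o in observations if o.facings_seen == 0]
    low_stock = [o.sku for o in observations if 0 < o.facings_seen < o.facings_expected]
    flagged = len(set(out_of_stock) | set(low_stock))
    return {
        "pricing_errors": price_errors,
        "out_of_stock": out_of_stock,
        "low_stock": low_stock,
        "planogram_compliance": 1 - flagged / max(len(observations), 1),
    }
```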

    “This latest iteration of our valued and longstanding partnership with Sam’s Club marks the beginning of realizing the next phase in our company’s vision,” said Dr. Eugene Izhikevich, CEO of Brain Corp. “We are actively taking BrainOS-powered robots from primarily task-oriented machines to in-store data acquisition platforms, able to deliver actionable insights on inventory availability, planogram compliance and more. This adds significant ROI for retailers.”

  • Electric Sheep turns old lawnmowers into robots

    A “dumb” lawnmower made autonomous with a bolt-on kit. (Image: Electric Sheep)

    A company that turns old lawn tech into state-of-the-art robots just got a big vote of confidence via a major fundraising round. Electric Sheep, whose name harkens to the Philip K. Dick novel upon which Blade Runner was based, announced a $21.5 million Series A to teach old lawn care tech new tricks.

    The company’s success represents an important bellwether for robotics adoption. Technologies like commercial trucks and lawnmowers are inevitably going to operate autonomously, and much of the tech exists to begin the transition immediately for certain users. But the economics of replacing existing fleets won’t be viable right away. In the interim, enterprises are faced with the prospect of being out-innovated by competitors.

    Expect add-on autonomy, then, to become increasingly important, providing a bridge between fully autonomous technology and old fleets that still have useful life left. That’s the vision of firms like autonomous driving startup Drive.ai, which retrofits cars into autonomous vehicles, and Blue White Robotics, which turns existing tractors into farm robots.

    Also: Should robots be able to deliver booze?

    Electric Sheep is taking the same strategy to the commercial mower market. It’s an excellent use case in a tight labor economy where lower-wage positions have been difficult to fill. According to the National Association of Landscape Professionals, which calls 2021 “the worst labor market in recent history,” landscaping has been a challenging labor market for employers, with tens of thousands of full-time positions going unfilled. Against that backdrop, Electric Sheep sees a moment of transition.

    “Automation of the $115 billion outdoor maintenance market is an enormous opportunity hiding in plain sight,” explains Griffin Schroeder, Partner, Tiger Global, which led the recent round. “Electric Sheep is leading the way with fully autonomous solutions. We are excited to invest and help them grow their leadership position.”

    The company’s flagship product is called Dexter, an autonomous add-on that attaches easily to new or existing lawnmowers and requires minimal training to autonomously mow any type of grass. Landscapers show Dexter what to do one time, and the robot then autonomously repeats those actions. The sensor suite includes LiDAR, cameras, GPS, and ultrasonic sensors for precise maneuvering across diverse terrain, and the system receives OTA firmware updates. As has been the trend with enterprise automation, the technology is being offered via an as-a-service model.
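Electric Sheep hasn’t described Dexter’s software stack here, but “show it once, then repeat” behavior is commonly built as a teach-and-repeat controller: record a reference path during the demonstration, then track it on later runs. A minimal sketch of that general idea, with invented class and parameter names:

```python
import math

class TeachAndRepeat:
    """Toy teach-and-repeat controller: record a demonstrated path once,
    then steer along it on replay. Illustrative only, not Electric Sheep's code."""

    def __init__(self):
        self.waypoints = []   # (x, y) positions captured during the demonstration
        self._next = 0        # index of the waypoint currently being tracked

    def record(self, x, y):
        """Call repeatedly during the human demonstration (e.g. from GPS/odometry)."""
        self.waypoints.append((x, y))

    def next_heading(self, x, y, reached=0.5):
        """During replay: skip waypoints already within `reached` meters, then
        return the heading (radians) toward the next one, or None when done."""
        while self._next < len(self.waypoints):
            wx, wy = self.waypoints[self._next]
            if math.hypot(wx - x, wy - y) > reached:
                return math.atan2(wy - y, wx - x)
            self._next += 1
        return None
```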

    “I don’t think people realize that lawns are America’s largest crop,” says Naganand Murty, CEO of Electric Sheep. “More land and water are dedicated to lawns than to wheat and corn combined, and more than 40 million acres of land in the U.S. has some form of lawn. $20 billion is allocated annually to lawn mowing alone. Solutions such as Electric Sheep’s Dexter robot are helping our customers meet demand and better allocate their already scarce labor pool.”

    The company plans to use funds from the Series A to expand across all departments in order to meet growing customer demand. It currently has contracts with thirty customers across the U.S., and interest is high.

  • To drill or not to drill? Maybe AI knows the tooth better than your dentist

    Have you ever gone to the dentist and been unsure if that spot on your tooth the doctor is looking at is really a cavity? Or maybe you’ve gone to get a second opinion, only to have the new practice tell you that you need a crown on a completely different tooth?   

    Unfortunately, this story is all too common in dentistry. In fact, there’s a well-known story about a Reader’s Digest reporter who went to see 50 different dentists and received nearly 50 different diagnoses. That makes dentistry ripe for technological innovation aimed at increasing confidence and accuracy in diagnoses.

    For many reasons, dentistry is an ideal frontier for AI: Not only does the field produce an abundance of x-rays, but they’re also easy to anonymize and make a great dataset for AI and machine learning to scan and learn from. Additionally, the dental field doesn’t have trained radiographers the way the broader healthcare industry does, which could make an extra set of “AI eyes” a welcome addition for well-intentioned practitioners.

    Los Angeles-based Ophir Tanz, CEO of Pearl, is one developer hoping dental AI technology can take some of the guesswork out of dentistry, giving both patients and providers peace of mind. The son of a dentist, Tanz recognized the potential for AI in the industry, and after successfully standing up contextual intelligence AI company GumGum (now valued at $700M), he’s using the same tech to transform the dental industry.

    Also: Has AI found a treatment for Fragile X?

    I connected with Tanz about the future of dentistry and the impact AI could have on patient outcomes and the industry at large.

    GN: Why is dentistry the ideal frontier for AI?

    Ophir Tanz: The dental field is ripe for AI innovation for a couple of reasons. First, the abundance of radiographic images: patients receive dental x-rays every two years, so there are more dental radiographs in the world than any other form of medical imagery. This is extremely helpful when it comes to developing AI radiologic systems for dentistry, because those systems need to be trained on large numbers of radiographs. Second, dentistry has a more entrepreneurial character than other forms of medicine. Most dentists are invested to one degree or another in a practice, so they’re not just doctors but also business owners. A dentist’s primary concern is delivering optimal patient care, which AI helps them do, but it also helps them address the business operations concerns they face as practice owners. The same AI insights that elevate the standard of care and patient outcomes can also be applied to help them make smarter decisions around budgeting, staffing, materials, equipment resourcing, etc. Innovation requires adoption, and dentists are natural early AI adopters because its benefits touch every facet of their work, and because, unlike the majority of doctors in other fields, dentists are business owners, so they have both the authority and the impetus to invest in AI.

    GN: There are similar applications rolling out in other medical spheres. Can you give us an overview of how AI is being used to read scans across the medical ecosystem?

    Ophir Tanz: There is a wide range of AI technologies being applied in other areas of medicine, not only in radiologic applications but in intake, triage, biologic testing-based diagnostics, predictive diagnostics, etc. Talking specifically about AI-based analysis of medical imagery, thousands of radiologic AI systems have been developed over the past 15 years. The vast majority of these systems have come out of research institutions. Not all of them have proved useful; many of those that could be useful are effectively redundant (i.e. they perform the same task with more or less the same outcome), and not all of those that were both effective and novel have found their way past the regulatory and commercial hurdles to real-world application. There are currently around 350 FDA-approved medical devices that apply AI in some capacity, and the vast majority of these perform some degree of analysis of medical imagery. Most help automate repetitive tasks, like anatomical segmentation. However, there are plenty of AI-powered imaging systems that perform diagnostic functions. Whatever their use (oncology, neurology, cardiology, ophthalmology, etc.), these devices perform highly specific functions, like detecting a specific condition in a specific part of the body that can be found in a specific type of medical image. As such, the chance that anyone has ever encountered an AI system in the course of their medical care is extremely low. Naturally, this will change as AI technology becomes more generalizable and powerful, but the first medical AI that people, at scale, will ever experience is almost certain to be in a dental office. That’s true not only because people visit the dentist more frequently than they do any other kind of doctor but because we’ve been able to develop systems with broad utility in detecting a comprehensive array of dental conditions.
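Pearl’s model internals aren’t covered in this interview; purely as an illustration of the pattern Tanz describes, a radiograph detector of this general kind returns labeled, scored regions that a practice can filter and show chairside. Every name, label, and threshold in the sketch below is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Finding:
    label: str          # e.g. "caries" or "calculus" (hypothetical label set)
    confidence: float   # detector score between 0 and 1
    box: tuple          # (x_min, y_min, x_max, y_max) in image pixels

def review_findings(findings, threshold=0.7):
    """Keep only findings the detector is reasonably sure about, most confident first,
    so the dentist can confirm or dismiss each one before showing the patient."""
    kept = [f for f in findings if f.confidence >= threshold]
    return sorted(kept, key=lambda f: f.confidence, reverse=True)

# Example output a detector might emit for one bitewing radiograph
findings = [
    Finding("caries", 0.92, (412, 118, 468, 171)),
    Finding("caries", 0.41, (120, 300, 160, 340)),    # below threshold, dropped
    Finding("calculus", 0.78, (230, 95, 270, 130)),
]
print([f.label for f in review_findings(findings)])    # ['caries', 'calculus']
```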
GN: How is your technology being received by dentists, who may be accustomed to doing things a certain way?

Ophir Tanz: The response we’ve seen from dentists using our solutions has been overwhelmingly positive, but that’s to be expected because early adopters are more likely to have a favorable attitude about AI. There are certainly dentists out there who are skeptical. Overcoming that skepticism will require education. Once these skeptics get their hands on the technology and learn more about what it can and cannot do, they’ll realize that AI is not a threat to their profession, that it’s simply a powerful tool that enables them to perform their jobs at a higher level. I expect adoption to accelerate rapidly as AI literacy in dentistry expands and people become more comfortable with the concept of AI diagnostics in general. This is already starting to happen. We’re selling our real-time radiologic aid, Second Opinion, in Europe, Australia, New Zealand, Canada and various other territories, and our AI clinical management solution, Practice Intelligence, is in use in thousands of practices domestically and abroad. These are really transformative solutions, and I believe that as we continue to gain regulatory approval in different parts of the world, dentists will be ready for AI and quick to incorporate the technology into their daily routines.

Also: Drugs by drone: Good idea?

GN: How are patients responding to technology rollouts like this one?

Ophir Tanz: Patient response is one of the things that dentists tell us they love most about the technology. Naturally, there’s a wow factor that this technology even exists, and patients appreciate that their dentist is applying the state of the art in delivering care. Then there’s the impact of AI on the patient’s ability to understand their doctor’s diagnosis. Rather than pointing at an indistinct blotch on the radiograph and saying, “It’s hard to make out, but you have a cavity here that needs to be treated,” the doctor is showing the patient the radiograph with the cavity clearly circumscribed and labeled by the AI. Patients get a clearer understanding of what exactly is going on in their mouths, and that gives them greater confidence in the treatment recommendation. This is what dentists report to us, but I think it’s reasonable to extrapolate that the better patient communication the AI enables is leading to greater patient trust, and hopefully improved patient retention. Now that we’re in more practices, we’re developing research looking at real-world impact to verify anecdotal accounts of patient perspectives. We’re starting that research in Germany with academic support. There are many questions we’d like to answer over time. Does AI help speed up patient visits? Do patients trust doctors who use AI more than doctors who do not? Do they accept treatment from AI-equipped doctors at a higher rate? We should have answers to some of these questions pretty soon.

GN: When you think of dentistry in 10-15 years, how will technology have changed the profession and patient experience?

Ophir Tanz: I expect most dental offices in the world will be applying AI in some form, often across much of the practice workflow, both clinically and operationally. Charting, scheduling, inventory management: these kinds of tasks will be accomplished with markedly more efficiency than they are today.
The time gained should deliver some combination of the following benefits: lower costs of care, more patient volume, and higher-quality patient-doctor interaction. From a clinical perspective, we’ll have a higher standard of patient care across the board and better population-wide oral health. At the farther end of that timeframe, I hope we’ll see AI facilitating more predictive and preventative dental care. It is not unreasonable to anticipate that we will be bringing a wide array of data points from outside the patient’s mouth (medical records, family history, daily habits and lifestyle information) to bear both in developing individualized courses of treatment and in establishing the kinds of oral-systemic health links that have proved so hard to pin down to date. As I noted previously, we see dentists more frequently than we do any other doctor, so it would be a wonderful thing if AI could give us insights that transform the mouth into a window to our heart, lungs or brain. That future may be more than 15 years out, but whenever we reach it, we’ll have AI to thank.


  • Should robots be able to deliver booze?

    Starship’s autonomous robot delivers on a college campus. (Image: Starship)
    A prolific early contender in the autonomous delivery race has forged a partnership with an authentication company. The reason? There’s a big market for the delivery of age-restricted items.

    There’s a larger market story behind this partnership. The autonomous delivery wars have officially commenced, and now there’s a race underway among last-mile delivery robot developers and service providers to forge strategic partnerships and carve out service niches. Recently, autonomous delivery company Nuro announced a partnership with 7-Eleven to deliver the convenience brand’s products to customers’ doors. Overall, the market for autonomous mobile robots (AMRs) and autonomous ground vehicles (AGVs) is forecast to generate over $10bn by 2023, according to Interact Analysis.

    This is why the conversation has inevitably turned toward age-restricted items, such as alcohol, and other sensitive deliveries that legally require identity verification, such as prescription drugs. Those spaces represent a huge delivery market, but only if the already complex regulatory paradigm for delivery robots can accommodate one more complication: foolproof identity authentication.

    Enter Veriff, an authentication provider that offers technologies like face-match biometric analysis, identity document verification, and proof-of-address capture. Veriff will add an extra layer of safety and security to Starship’s autonomous delivery fleet, making Starship the first company in the world to create a fully autonomous end-to-end delivery service for age-restricted items.

    “Partnering with Veriff allows Starship to autonomously deliver age-restricted items in the UK and beyond as we continue to take on new markets and stores at a rapid pace,” said Ryan Tuohy, Senior Vice President of Business Development and Sales at Starship Technologies. “We are excited to work with Veriff in providing the highest quality identity verification solutions for our users to ensure their safety and peace of mind on our trusted platform.”

    Of course, problems abound with the scheme. For one, delivery robots are not in wide use thanks to the current regulatory paradigm, which is patchwork, often hyperlocal, and in many cases only now on the verge of being created. Delivery firms are being very careful to prove their technology in manageable testbeds, such as college campuses, rather than rush into cities prematurely and incur regulatory backlash (see rideshare and electric scooters).

    Age-restricted items add complexity to an already complex situation. I recently connected with Susan Lang, founder and CEO of XIL Health, a complex drug pricing analytics company, about the prospects of delivering medicine via drones.

    “Most likely, companies experimenting with drone deliveries will exclude controlled substances and avoid any class two drugs because of the sensitivities involved,” says Lang. “One of the biggest challenges is that drone delivery won’t work for every type of product, so they need to test to see when it works.”

    That said, Starship is clearly paving the way for some kind of convenience play involving age-restricted items. The robot bartender has become a trope in the robotics sector, sort of a comical send-up of how robotics technology is being applied to slightly silly use cases. But booze delivery by robot seems like a real market grab, and Starship is positioning itself to take the lead.

    Whether this just means the fake ID will become increasingly complex remains to be seen. Your move, high schoolers.

  • Hey drone industry: Quit griping, it's time to work with the FAA

    American Robotics Scout drone ready for deployment. (Image: American Robotics)

    The commercial drone industry is expected to grow at a compound annual growth rate of 57% from 2021 to 2028 as a result of the need for better data and analytics that only drones can provide. In order for drones to reach their full potential, drone developers must work with the Federal Aviation Administration (FAA) to manufacture devices that can safely and successfully operate under Aviation Rulemaking Committee (ARC) and FAA guidelines.

    That’s the clarion call of American Robotics, the first company approved by the FAA to operate automated drones without humans on-site, which was recently selected to participate on the FAA’s Unmanned Aircraft Systems (UAS) Beyond-Visual-Line-of-Sight (BVLOS) Aviation Rulemaking Committee (ARC) to advance BVLOS drone operations. In the eyes of co-founder and CEO Reese Mozer, the FAA’s approach to BVLOS flight for commercial drones will dictate the state of the drone industry for years to come, and it’s up to the industry to do all it can to work in lockstep with the regulator.

    I sat down with Mozer to discuss why the drone sector needs to work with the FAA and what that means for the future of drone delivery and other BVLOS applications.

    GN: What’s the FAA’s current policy on BVLOS, and what are the FAA’s primary concerns when it comes to BVLOS?

    Reese Mozer: The FAA’s mission and responsibility is the safety of the National Airspace System (NAS), including people and property both in the skies and on the ground. Prior to American Robotics’ 2021 waiver and exemption, no company had demonstrated to the FAA safe operation without human visual observers (VOs) on-site. The reasons for this are numerous and complex, and are both technological and cultural. The short explanation is that humans have been a constant presence during flight for the past hundred years and, ultimately, the primary failsafe if anything goes wrong. Shifting more of this responsibility to software and hardware required a series of technology innovations to be developed, tested, and adequately communicated to regulators at the FAA.

    For the past five years, American Robotics has been developing a suite of proprietary technologies explicitly designed to produce the industry-leading solution for safe automated flight. We designed these technologies in concert with a low-risk Concept of Operations (CONOPS) and conducted extensive testing and evaluation as part of a long-term regulatory strategy to prove our system’s safety. For example, the Scout System incorporates multiple novel risk mitigations, including proprietary detect-and-avoid (DAA) sensors and algorithms, advanced automated system diagnostics and failsafes, automated pre-flight checks, and automated flight path management. If anything deviates from the expected, safe operation plan, our drone systems take immediate corrective action, such as altering the flight course and returning to the base station. By developing a layered, redundant system of safety that includes these proprietary technical and operational risk mitigations, we have proven that our drone-based aerial intelligence platform operates safely in the NAS, even when it conducts flights beyond visual line of sight of both the operator and any humans on-site.
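The interview doesn’t include American Robotics’ code; as a loose illustration of the layered failsafe logic Mozer describes, a supervisory check on each flight cycle might reduce to something like the sketch below. The states, inputs, and rules are invented for illustration.

```python
from enum import Enum, auto

class Action(Enum):
    CONTINUE = auto()        # everything nominal, keep flying the plan
    REROUTE = auto()         # alter course, e.g. around a detected intruder aircraft
    RETURN_TO_BASE = auto()  # abort the mission and land at the base station

def supervise(preflight_ok, diagnostics_ok, intruder_detected, on_planned_path):
    """Toy supervisory failsafe: any deviation from the expected, safe plan
    triggers an immediate corrective action (illustrative, not the Scout System)."""
    if not preflight_ok or not diagnostics_ok:
        return Action.RETURN_TO_BASE
    if intruder_detected:
        return Action.REROUTE
    if not on_planned_path:
        return Action.RETURN_TO_BASE
    return Action.CONTINUE
```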

    GN: How do you hope the rulemaking will change through the BVLOS ARC?

    Reese Mozer: Our hope is that the recommendations from the BVLOS ARC will encourage the FAA to more expeditiously authorize expanded BVLOS operations on a national scale, allowing industry to meet the significant demand for automated drone-based inspection. American Robotics and others in the industry have successfully demonstrated that drones can be operated to a very high threshold of safety in the national airspace and can perform missions that are vital to society without endangering other users of the airspace or the general public. Existing regulatory pathways such as waivers and exemptions typically lack the efficiency and speed desired by industry and are often cost-prohibitive for smaller companies to obtain. Similarly, existing Type Certification (TC) processes were designed to ensure the safety of manned aircraft operations, and applying those processes to drones is generally not effective due to the many size, technology, and risk differences between drones and manned aircraft. Within the BVLOS ARC, the drone industry has proposed streamlined means of certifying drone technology and assessing the real-world risks that BVLOS operations of drones pose. New rulemaking based on these proposals would enable expanded BVLOS operations in a safe and scalable manner while ensuring the safety of all operators within the NAS. It should be noted, however, that the FAA’s stated timeline for implementing such rulemaking is 3-5 years. Thus, the existing path of waivers and exemptions taken by American Robotics is likely to persist until then.

    GN: Why will BVLOS take drones to a new dimension? Why is this such a critical milestone?

    Reese Mozer: “True” BVLOS, i.e. operation where neither a pilot nor visual observers (VOs) are required, is critical to unlocking the full potential of the commercial drone market. The economics of paying a VO or pilot on the ground to continuously monitor a drone flight simply do not make sense and have significantly hampered commercial users’ ability to justify building out a drone program. It’s important to remember that flying a drone once or twice a year has little to no value for the vast majority of commercial use cases. Typically, to see the benefits of drone-based data collection, flights need to be conducted multiple times per day, every day, indefinitely. This frequency allows drones to cover enough area, survey at the proper resolution, and detect problems when they occur. Today, the average hourly rate for hiring a drone pilot in the U.S. is about $150 and can reach $500 per hour. Thus, overcoming the human costs associated with commercial drone use has been one of the biggest hindrances to the market and has impacted the viability and implementation of this technology at mass scale.

    American Robotics’ leadership in expanding automated BVLOS operations represents a critical inflection point in the aviation, drone, and data worlds. As the first company approved by the FAA to operate in this manner, we have set the stage for the next generation of commercial drones. Autonomous operations enable the real-time digitization of physical assets and allow users in industrial markets to transform their monitoring, inspection, and maintenance operations. This technology represents the key to a new generation of industrial data that will bring about increased cost-efficiency, operational safety, and environmental sustainability.
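To put Mozer’s cost argument into rough numbers: the hourly rates come from the interview, while the one-flight-hour-per-day duty cycle is an assumption for illustration.

```python
# Back-of-the-envelope annual cost of keeping a human pilot/observer in the loop.
LOW_RATE, HIGH_RATE = 150, 500   # USD per pilot-hour (figures cited in the interview)
HOURS_PER_DAY = 1                # assumed duty cycle for illustration
DAYS_PER_YEAR = 365

low = LOW_RATE * HOURS_PER_DAY * DAYS_PER_YEAR
high = HIGH_RATE * HOURS_PER_DAY * DAYS_PER_YEAR
print(f"~${low:,} to ~${high:,} per site per year")   # ~$54,750 to ~$182,500
```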
GN: What sectors are automated BVLOS operations particularly important to? Can you give some examples of how those sectors plan to use BVLOS?

Reese Mozer: Automated operations, which are enabled by “true” BVLOS authorization, are required for 90% of the commercial drone market. An easy way to think about it is that any use case requiring frequent inspection over the same area likely requires automated BVLOS to be practical. Example sectors include Oil & Gas, Bulk Materials & Mining, Rail, and Agriculture. Each has significant demands in terms of image resolution and frequency that can only be met by automated BVLOS flight.

Oil & Gas: There are over 900,000 well pads and 500,000 miles of pipeline in the United States. Every inch of those assets needs to be continually monitored for defects and leaks to assure safety and reduce GHG emissions. Automated BVLOS operation is critical to enabling drones to perform these tasks properly on a regular basis.

Stockpiles & Mining: Current stockpile and mining inspections involve teams that manually estimate volumetrics, either with hand-held cameras or the naked eye, typically resulting in low-accuracy data. These incorrect measurements put a strain on operations and drastically reduce visibility and control over the bulk materials supply chain. With automated BVLOS, we can generate a hyper-accurate volumetric analysis of stockpiles and mines every day, reducing the likelihood of global supply chain disruptions across a variety of industries.

Rail: Over 140,000 miles of rail track in the United States require regular monitoring and inspection to assure safety. Common track defects include tie skew, tie clearance, and track impediments. Automated BVLOS allows for the scalable implementation of drones across the nation’s rail infrastructure, helping to reduce the odds of a train derailment and increasing the uptime of train systems.

Agriculture: To sustain the growing population, the world needs to produce 70% more food by 2050. At the same time, the average age of a farmer in the United States is 59 and rising, with fewer new entrants to the agricultural labor force each year. The result of these socio-economic factors is a requirement for increased technology and automation on the farm. There are over 900 million acres of farmland in the United States, and automated BVLOS operation is the only scalable way to monitor those acres by drone routinely.

GN: Have developers been eager or reticent to work with the FAA? What should manufacturers be doing to help pave the way?

Reese Mozer: The relationship between industry and the FAA has been evolving for the past 10 years. Early on, each party was very foreign to the other, with the drone industry being born from Silicon Valley-esque hacker roots and the FAA acting as the 100-year arbiter of manned flight. As a result, many developers either weren’t eager or didn’t understand how to work with the FAA in the early years of the drone industry. Recently, there have been significant and promising changes, but some still persist in that hesitant or unfamiliar mindset. I think an important fact for manufacturers to remember is that the FAA’s job is not to innovate, and it never will be. Its responsibility is to evaluate aviation technologies to assure the safety of the national airspace. If the industry wants to do something new, it is on our shoulders to develop, test, and prove that technology to our regulators.

  • Are practical personal exoskeletons finally in reach?

    (Image: Wandercraft)
    A company that develops self-balancing personal and therapeutic exoskeletons has just closed a $45 million equity financing round. The Series C breathes fresh life into a technology class that’s been much hyped but has struggled to break out of niche markets.

    Founded in 2012, Paris-based Wandercraft has been around for a while, but it’s less known in the U.S. than rival Ekso Bionics, long the marquee player in the space. Like Ekso, Wandercraft was founded to create a mobility device to supplant wheelchairs for people with mobility issues. Also like Ekso, Wandercraft narrowed its focus with its first commercial product and is targeting the therapeutic healthcare market (Ekso has since branched into industrial markets, including auto manufacturing).

    “With the support of patients, medical professionals and the DeepTech community, Wandercraft’s team has created a unique technology that improves rehabilitation care and will soon enable people in wheelchairs to regain autonomy and improve their everyday health,” says Matthieu Masselin, CEO of Wandercraft.

    Wandercraft’s autonomous walking exoskeleton, the first version of which is called Atalante, was commercialized in 2019 and is used by rehabilitation and neurological hospitals in Europe and North America. Atalante provides innovative care for many patients based on realistic, hands-free, crutch-free locomotion. However, the company has a more ambitious personal exoskeleton in the works.

    The focus on the rehab market out of the gate reflects a hard reality for exoskeleton tech. While the vision of providing a robust mobility device to those living with mobility issues is inspiring, the fact remains that wheelchairs are an effective, inexpensive, and widely distributed solution to a broad array of mobility issues. By contrast, robotic suits are comparatively expensive and can’t yet match the functionality wheelchairs offer, particularly in accessibility-conscious regions. That makes the market for a mobility-first device elusive, which explains the pivot to therapeutics, where exoskeletons can get wheelchair-bound patients up and walking, with tremendous physiological and recovery benefits.

    None of which is to say, of course, that the therapeutic market can’t serve as an important preliminary toehold as prices for exoskeleton suits fall and the technology matures. That’s precisely what Wandercraft has done, and it believes the time for a personal device has come.

    To that end, most of this new round of financing will be used to fulfil the company’s mission of “mobility for all” through the continued development, then launch, of the new Personal Exoskeleton for outdoor and home use. The funding will also allow Wandercraft to accelerate the deployment of Atalante, its pioneering CE-marked rehabilitation exoskeleton, in the USA.

    “We are thrilled to lead this round of financing and to bring together these responsible investors in order to make the world better,” says Alan Quasha, Chairman and CEO of Quadrant Management, which participated in the round. “Wandercraft has developed the world’s most advanced technology in walk robotics and markets the first self-stabilized exoskeletons. We share Wandercraft’s ambition to provide a new solution for mobility, and to improve the health of millions of people using wheelchairs. We believe that they will transform mobility and become the leading player in the market.”

    Should Wandercraft succeed in marketing a personal exoskeleton, the next few years will be an interesting bellwether for a technology that hasn’t yet lived up to its founding promise.

  • Robotaxis get new learning strategies to face “the edge”

    (Image: Motional)
    Taxis are excellent candidates to become the go-to early use case for self-driving cars in the wild. But to get there, autonomous vehicle developers face a daunting challenge: equipping their cars to meet an array of scenarios that can’t be fully anticipated.

    AI and deep learning tools have been the secret sauce for self-driving car programs, endowing vehicles with the adaptability to face new challenges and learn from them. The most difficult of these challenges are what developer Motional calls “edge situations,” and when the goal is to build a safe robotaxi, identifying and solving for these outliers is a technological imperative. To do this, Motional has developed its own Continuous Learning Framework, or CLF, which helps its vehicles get smarter with each mile they drive. While Motional’s new IONIQ 5 robotaxi runs on electricity, a spokesperson recently quipped that the CLF is powered by data: terabytes of data are collected every day by Motional’s vehicles mapping cities throughout the U.S.

    The CLF works like a closed-loop flywheel: Each step in the process is important, and completing one step advances the next. As the inflow of data coming from the company’s vehicles grows, the flywheel is expected to turn faster, making it easier to accelerate the pace of learning, solve for edge cases, map new operational design domains (ODDs), and expand into new markets. This machine learning-based system allows Motional to automatically improve performance as it collects more data, and it does this by specifically targeting the rare cases.

    For a deeper dive into how the CLF process and data help Motional improve performance, I recently connected with Sammy Omari, VP of Engineering and Head of Autonomy.

    GN: Tell us more about the Continuous Learning Framework (CLF) and why Motional developed it.

    Sammy Omari: At Motional, we’re developing Level 4 robotaxis: autonomous vehicles that do not require a driver at the steering wheel. We will be deploying our robotaxis in major markets through our partnerships with ride-hailing networks. In order to achieve wide-scale Level 4 deployments, our vehicles need to be able to recognize and safely navigate the many unpredictable and unusual road scenarios that human drivers also face. To reach this level of sophistication, we’ve developed a Continuous Learning Framework (CLF), which uses machine learning principles to make our AVs more experienced and safer with every mile they drive.

    Motional’s CLF is a revolutionary machine learning-based system that allows the team to automatically improve performance as we collect more data, and it does this by specifically mining for the rare situations that our vehicles might encounter. The CLF works like a closed-loop flywheel: each step in the process is important, and completing one step advances the next. The entire system is powered by real-world data collected by our vehicles.

GN: What type of rare cases or outliers does Motional target through the CLF?

Sammy Omari: The vast majority of the time, driving from one point to another is uneventful and relatively mundane. However, occasionally something unusual or “exciting” happens, involving a broad range of rare and unique driving experiences called edge cases. The edge cases that Motional targets through the CLF can include vehicles running red lights or violating the right of way, pedestrians darting into traffic, cyclists carrying surfboards on their backs, racing trikes, and other types of road users or behavior that we don’t encounter every day.

GN: How does Motional utilize the data gathered to help improve vehicle performance?

Sammy Omari: Through the CLF, we’re able to find those rare edge cases in large volumes of data, create training data through automatic and manual data annotation, retrain our machine learning models using that data, and then evaluate the updated models. Motional’s Scenario Search Engine allows developers to quickly search Motional’s vast drivelog database so they can introspect and visualize the results in seconds. This scenario query can run every time our autonomous vehicles are on the road collecting data. When we collect a sufficient number of samples and expand our training data, we can then retrain the machine learning models. We’ve built this machine learning-based flywheel that allows us to automatically improve performance as we collect more data, and it does this by specifically targeting the rare edge cases. As the inflow of data coming from our vehicles grows, the flywheel will turn faster, making it easier to accelerate the pace of learning, solve for edge cases, map new ODDs, and expand into new markets.

GN: What does this mean for Motional’s future growth?

Sammy Omari: Our innovative approach to machine learning helps us create smarter, safer autonomous vehicles that can navigate a wide range of complex environments. This allows us to deploy our vehicles in new markets faster, which ultimately will improve road safety more broadly.
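Motional hasn’t published the CLF’s internals; purely as an illustration of the flywheel Omari describes (query drive logs for a rare scenario, annotate the matches, retrain, evaluate, redeploy), one turn of such a loop could be sketched as below. Every function and parameter name here is hypothetical.

```python
def continuous_learning_step(drivelogs, scenario_query, model,
                             annotate, train, evaluate, min_samples=500):
    """One turn of a CLF-style flywheel (illustrative sketch, not Motional's code).

    drivelogs        -- iterable of logged drive segments
    scenario_query   -- predicate matching a rare edge case (e.g. a red-light runner)
    annotate         -- turns matched segments into labeled training data
    train / evaluate -- model update and regression check
    """
    matches = [seg for seg in drivelogs if scenario_query(seg)]
    if len(matches) < min_samples:
        return model, len(matches)       # keep collecting; the flywheel turns slowly
    new_data = annotate(matches)         # automatic and/or manual labeling
    candidate = train(model, new_data)   # retrain on the expanded dataset
    if evaluate(candidate) >= evaluate(model):
        model = candidate                # only ship models that don't regress
    return model, len(matches)
```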