More stories

  • AI agents will change work and society in internet-sized ways, says AWS VP

    AWS Summit 2025 at the Javits Center in NYC. Sabrina Ortiz/ZDNET

    Forget the old Apple slogan, “Think different.” For Deepak Singh, VP of developer agents and experiences at AWS, the mantra of the future is “work differently,” and the way he wants to do that is through agentic AI.

    “I think people get too hung up on the automation and efficiency part, which are outcomes,” said Singh. “We are working differently, but the way we are working different is making us more effective because [agents are] solving harder problems or more problems than you could do before.”

    Also: AWS aims to be your one-stop-shop for AI agents from Anthropic, IBM, Perplexity, and others

    Singh sat down with ZDNET on Wednesday, shortly after AWS introduced a bevy of new tools and features centered on agentic AI. Among the biggest announcements were Amazon Bedrock AgentCore, a new enterprise-grade platform designed to streamline the implementation of new agents, and a new virtual store within AWS Marketplace, which lets customers choose agents from Anthropic, IBM, Perplexity, Salesforce, and other vendors.

    At the core of the announcements is the goal of helping organizations more easily adopt, customize, and deploy AI agents. This ease of access means the technology will be deployed more rapidly, and that the way people work will be transformed just as quickly; Amazon postulates it's for the better.

    Work smart

    Singh, whose work focuses on building experiences that optimize how developers build software, told ZDNET that agentic AI offers workers at all levels the opportunity to build more efficiently. For example, Singh said, a software developer intern could spend more time learning how the system works instead of learning the intricacies of a new programming language. Ultimately, a better understanding of the system, facilitated through interactions with AI agents, can help the intern develop the project they are working on.

  • I tested the Ferrari of robot mowers for a month – here’s my verdict

    ZDNET’s key takeaways:

      • The Mammotion Luba 2 3000H is available for $2,599.
      • Built to handle uneven terrain, the Luba 2 is a reliable all-wheel-drive (AWD) robot mower with a GPS-powered perimeter that is surprisingly easy to set up.
      • The Mammotion app is not very user-friendly and can get buggy after a firmware update.

  • How I started my own LinkedIn newsletter for free – in 5 easy steps

    David Gewirtz / Elyse Betters Picaro / ZDNET

    It’s been almost exactly two years since I launched my weekly Advanced Geekery email newsletter on Substack. Each week, I list my latest ZDNET articles, showcase any new videos I’ve put out, sometimes spotlight projects I’m working on (and those of readers), and share a few great YouTube videos and articles worth reading.

    The newsletter is a great way for those who like my work to keep up with what I produce. Recently, I’ve started getting requests for a newsletter on LinkedIn from members who are much more LinkedIn-centric. Ever since Twitter took its wacky dive off the credibility cliff, LinkedIn has been picking up the slack in professional and work-related social networking.

    Also: LinkedIn is making it easier to understand the full impact of your posts – here’s how

    As it turns out, starting a LinkedIn newsletter is both easy and free. You don’t need a LinkedIn Premium account. LinkedIn will notify your network when you publish the first edition of your newsletter, and it will also invite new followers to subscribe. Each issue you put out will be shared to your feed. Plus, anyone who signs up as a subscriber will get an email notification in their inbox.

    I went ahead and set up Advanced Geekery on LinkedIn. The two editions (Substack and LinkedIn) are basically identical. I now write and edit each issue on Substack, then selectively copy and paste the content over to a new LinkedIn newsletter article. The LinkedIn newsletter offers a little less formatting control than the Substack one, but it still looks pretty good. I can copy the text (with included links) from the Substack editor, but I have to add any pictures to the LinkedIn version manually. The process adds about 15 minutes to my workflow, which is a small amount of effort to reach a different audience.

    How to create a LinkedIn newsletter

    In this article, I’ll take you through the step-by-step process I used to set up my LinkedIn newsletter.
    If you want to set one up for Substack, I documented that as well.

  • Adobe Firefly can now generate AI sound effects for videos – and I’m seriously impressed

    Adobe / Elyse Betters Picaro / ZDNET

    Just a year and a half ago, the latest and greatest of Adobe’s Firefly generative AI offerings involved producing high-quality images from text, with customization options such as reference images. Since then, Adobe has pivoted into text-to-video generation and is now adding a slew of features to make it even more competitive.

    Also: Forget Sora: Adobe launches ‘commercially safe’ AI video generator. How to try it

    On Thursday, Adobe released a series of upgrades to its video capabilities that give users more control over the final generation, more options for creating video, and even more modalities to create in. Even though creating realistic AI-generated videos is an impressive feat that shows how far AI generation has come, one crucial aspect of video generation has been missing: sound. Adobe’s new release seeks to give creative professionals the ability to use AI to create audio, too.

    Generate sound effects

    The new Generate Sound Effects (beta) feature lets users create custom sounds by entering a text description of what they’d like generated. If users want even more control over the result, they can also use their voice to demonstrate the cadence, timing, and intensity they’d like the generated sound to follow.

    For example, if you want to generate the sound of a lion’s roar, but want it to match when the subject of your video opens and closes its mouth, you can watch the video, record a clip of yourself making the noise to match the character’s movement, and then accompany it with a text prompt that describes the sound you’d like created. You’ll then be given multiple options to choose from and can pick the one that best matches your project’s vibe.
    Also: Adobe Firefly now generates AI images with OpenAI, Google, and Flux models – how to access them

    While other video-generating models like Veo 3 can generate video with audio from text, what really stood out about this feature is the amount of control users have when inputting their own audio. Before launch, I had the opportunity to watch a live demo of the feature in action. It was truly impressive to see how well the generated audio matched the input audio’s flow, while also incorporating the text prompt to create a sound that actually resembled the intended output; no shade to the lovely demoer who did his best to sound like a lion roaring into the mic.

    Generate visual avatars

    Another feature launching in beta is Text to Avatar, which, as the name implies, turns scripts into avatar-led videos: videos that look like a live person reading the script. You can browse the library of avatars, pick a custom background and accents, and then Firefly creates the final output.

  • This hidden Google Maps feature is making people emotional – here’s why

    When did Google Maps Street View add See More Dates?

    Google introduced the “See More Dates” feature for Street View on desktop in 2014, allowing users to access historical imagery of various locations. In May 2022, to commemorate Street View’s 15th anniversary, Google expanded the feature to its mobile apps, enabling users to view past imagery on both Android and iOS devices.

    How far back does Google Maps Street View go in time?

    Google Street View imagery dates back to May 2007, when the service was first introduced in select US cities.

    Also: Ready to ditch Google Maps? My new favorite map app won’t track you or drain your battery – and it’s free

    Can’t see your home in Google Maps Street View’s history?

    The availability of historical imagery varies by location, depending on when Google captured images in that area. In some places, you can view imagery spanning more than a decade, while in others, the available history may be shorter. Historical Street View imagery of your house may be unavailable due to limited data collection in your area, privacy requests leading to image removal or blurring, technical issues causing missing images, or a location on private roads restricting access.

    Why is this Google Maps Street View feature trending?

    While Google’s “See More Dates” feature has been around for years, TikTok appears to have given it new life. You can see an example here. This trend of “going back in time” on Google Maps is as heartwarming as it is nostalgic. Thanks to Google’s Street View timeline, you can see how your home and neighborhood have changed over the years. If you’re lucky, you might even spot a familiar face waving back at you, frozen in time but forever remembered.

    Editor’s note: This article was originally published in December 2024. It was thoroughly updated, fact-checked, and republished on July 17, 2025.