More stories

  • I test a lot of AI coding tools, and this stunning new OpenAI release just saved me days of work

    David Gewirtz / Elyse Betters Picaro / ZDNET

    Last week, OpenAI quietly dropped a programming bombshell post on X/Twitter. It turns out you can now connect GitHub repos to Deep Research in ChatGPT. What makes this particularly interesting is that you can put ChatGPT Deep Research to work scanning that repo for all sorts of yummy nuggets of information.

    GitHub is an online resource owned by Microsoft that holds an enormous number of programming projects, both open source and proprietary. It’s used by teams to coordinate and track development. A GitHub repo is a repository for a given project. What Deep Research can now do is dig through the full source code for an entire software project and provide value.

    Also: The best AI for coding in 2025 (including two new top picks – and what not to use)

    Look, I have to come clean before we go any further. I’ve tried this thing out. It can effectively produce an internal code review. This is very cool. I am seriously chuffed by what we can get this beastie to do for us. AIs creep me out, so why do I want to hug this one? My editor says, “That’s all part of their plan.” Seriously, this is some powerful mojo.

    Most of my biggest programming projects have been based on code acquisitions from other programmers. Digging through their code and deciphering what the living heck they were thinking can be an excruciatingly time-consuming process. It’s not just about a line-for-line what-does-this-code-do sort of thing. It’s about how the project is architected. What are the various modules, and how do they interact? Where are their strengths, and where are their potential gotchas? What makes the UI function? What needs to be done to add, remove, or replace functionality?

    Also: How I used this AI tool to build an app with just one prompt – and you can too

    Deep Research can now do this with GitHub repos. It is almost totally sweet. The “almost” is because of the restrictions. Let’s cover those first, before I show you the hoops I made this thing jump through. According to ChatGPT 4o, here are the limits for each of the subscription tiers.

  • Your iPhone is getting these useful features with iOS 19 – including a big one for multitaskers

    Kerry Wan/ZDNET

    If today’s news is any hint of things to come, Apple’s Worldwide Developers Conference 2025 should be packed with notable announcements. In advance of Global Accessibility Awareness Day on May 16 and three weeks ahead of WWDC 2025, Apple is previewing more than a dozen accessibility features across its product line that will be available in iOS 19. And these are significant upgrades, from a Magnifier for the Mac to Accessibility Nutrition Labels in the App Store.

    Magnifier for Mac

    The Magnifier for Mac will work with Continuity Camera on the iPhone as well as with attached USB cameras. The Magnifier app will connect to your camera so you can zoom in on your surroundings. For instance, you can point a document at your Mac’s camera and read it easily on a bigger screen. The app will allow you to add customized views, adjust brightness, contrast, and more. You can also save your Views or group them for a more organized look.

    Also: Apple’s WWDC 2025: What to expect from iOS 19, VisionOS 3, and more

    Accessibility Nutrition Labels

    Apple will introduce a new Accessibility Nutrition Labels section to App Store product pages. These labels will highlight accessibility features within apps and games to better inform users about their downloads. Another new feature, Braille Access, will add a full-featured braille note taker that the company describes as “deeply integrated into the Apple ecosystem.” You’ll be able to take notes in braille format and perform calculations using Nemeth Braille.

    Also: Apple’s Meta Ray-Bans killer is only one of four major launches in 2027 – here’s the list

    Another impressive new feature is Switch Control for Brain Computer Interfaces (BCIs), which will let you control your devices without any physical movement.

    More accessibility features

    Here’s a rundown of 12 more accessibility features coming to your Apple devices:

    • Accessibility Reader will debut as a new systemwide reading mode on iPhone, iPad, Mac, and Apple Vision Pro. It is designed to make text easier to read for people with a wide range of disabilities, like dyslexia or low vision. Users will be able to customize text font, color, contrast, and more to make the content more readable.
    • Live Captions on Apple Watch will help those who are deaf or hard of hearing. This feature will turn their iPhone into a remote microphone to stream content directly to AirPods, Made for iPhone hearing aids, or Beats headphones.
    • Enhanced View on Vision Pro will expand everything in view, including the surroundings. It is built for those who struggle with low vision.
    • Personal Voice will create a voice in under a minute using 10 phrases, thanks to on-device machine learning and AI.
    • Background Sounds will be easier to personalize, with new EQ settings to help minimize distractions and increase focus.
    • Vehicle Motion Cues, a feature made to reduce motion sickness, is now coming to Mac.
    • Eye Tracking will enable iPhone and iPad users to use a switch or Dwell to make selections.
    • Head Tracking improvements will allow you to control your iPhone and iPad more easily.
    • Assistive Access will be added to the Apple TV app to create a better experience for users with intellectual and developmental disabilities.

    Also: Buying an iPhone 18 next year may look a little different – and why you should be excited

    • Name Recognition will be added to Sound Recognition so users who are deaf or hard of hearing can be alerted when their name is called.
    • Sound Recognition will also be part of CarPlay, so drivers or passengers who struggle with hearing can now be notified of the sound of a crying baby.
    • Voice Control and Live Captions will expand to add support for more languages.
    • Apple will allow users to share their accessibility settings with another iPhone or iPad.

  • Windows 10 and Microsoft 365 support deadlines didn’t change – why this story just won’t die

    ZDNET

    Here we go again. A zombie news story that should have been laid to rest last January has risen from the grave and is walking among us again.

    The original story

    In case you missed the original story, here’s a recap: Last January, dozens of tech-focused news sites reported that the free upgrade from Windows 10 to Windows 11 was “for a limited time only.” In a quote from the same source, they warned that Microsoft had decreed you would need to upgrade to Windows 11 to continue using Microsoft 365 apps on your PC after the Oct. 14, 2025, end-of-support deadline for Windows 10.

    Also: How to upgrade your ‘incompatible’ Windows 10 PC to Windows 11 – 2 free options

    The problem with all those reports is that they were based on an article by a very junior Microsoft employee posted on an obscure blog for Microsoft nonprofit customers. It wasn’t an official announcement, and the post was deleted that same day. A Microsoft spokesperson told ZDNET’s sister publication PCMag that the blog post “contained inaccurate information and a misleading headline.”

    Microsoft’s official support document, “What Windows end of support means for Office and Microsoft 365,” had been published a month earlier and was much less alarming. It begins: “Microsoft 365 apps will no longer be supported on Windows 10 after it reaches end of support on October 14, 2025.” That statement is repeated in bold later in the document: “Support for Windows 10 will end on October 14, 2025. After that date, if you’re running Microsoft 365 Apps on a Windows 10 device, the applications will continue to function as before. However, we strongly recommend upgrading to Windows 11 to avoid performance and reliability issues over time.”

    Back in the news

    So why did this zombie story start appearing in my news feeds today? I blame Forbes. They’re the ones standing there, shovel in hand, shouting about “Microsoft’s surprise deadline u-turn” while continuing to quote from the inaccurate, long-since-deleted zombie blog post.

    Also: Is your Microsoft account passwordless yet? Why it (probably) should be and how to do it right

    This week’s fuss is based on a newly published page at Microsoft’s product documentation site, Microsoft Learn: “Windows 10 end of support and Microsoft 365 Apps,” which contains this note: “To help maintain security while you transition to Windows 11, Microsoft will continue providing security updates for Microsoft 365 Apps on Windows 10 for three years after Windows 10 reaches end of support. These updates will be delivered through the standard update channels, ending on October 10, 2028.”

    That shouldn’t be a surprise. The three-year continuation of security updates for Microsoft 365 matches the Windows 10 Extended Security Updates available to Microsoft’s enterprise customers. It would be a nightmare to rebuild the Microsoft 365 update servers so they delivered updates only to PCs running Windows 10 with an ESU subscription while blocking other Windows 10 devices. So everyone gets those updates.

  • LastPass can now monitor employees’ rogue reliance on shadow SaaS – including AI tools

    Petri Oeschger/Getty Images

    With LastPass’s browser extension for password management already well-positioned to observe — and even restrict — employee web usage, the security company has announced that it’s diversifying into SaaS monitoring for small to midsize enterprises (SMEs). SaaS monitoring is part of a larger technology category known as SaaS Identity and Access Management, or SaaS […]

  • I replaced my Bose Ultra Open with these Shokz earbuds – they’re an even better deal

    ZDNET’s key takeaways

    • The Shokz OpenDots One are the company’s first clip-on earbuds, available in Black and Grey for $199.
    • They sport a comfortable, nondescript design with Shokz’s industry-leading bone conduction audio technology.
    • However, the earbuds’ touch controls are unreliable and awkward to use.

    I used to not be a Shokz believer; the […]

  • Google is upgrading Android Auto in 5 useful ways – including a big one for voice inputs

    Elyse Betters Picaro / ZDNET

    Gemini is hitting the road. According to Google, there are over 250 million cars that support Android Auto today, and more than 50 car models with Google built directly into the infotainment system. Those vehicles are about to get a lot smarter, as Gemini is coming to Android Auto.

    Also: Finally, I found an Android Auto adapter that’s highly functional, lag-free, and priced well

    Gemini for Android Auto will work the same as it does everywhere else, meaning you’ll have access to conversations, insight, helpful advice, and more. Here’s a look at just a few of the unique ways you can use Gemini on the go in Android Auto.

    1. Find the perfect restaurant

    If hunger hits while you’re driving, Gemini can help you find what you’re looking for. Just ask Gemini to “find great burger places along the way,” and you’ll get a list of restaurants that specialize in them. You can ask for insight from reviews or ask common questions like what the hours are. Once you’ve decided, Gemini can connect to Google Maps to get you there.

  • I’m a diehard Pixel user, and the Android 16 design overhaul has me more excited than ever

    Kerry Wan/ZDNET

    Android 16 is coming, and with it, we’ll see a considerable refresh on the UI front. It’s been four years since Material You was first released, and the latest iteration (version 3) looks to include some features and improvements that many have been hoping for.

    Also: Your Android devices are getting a major Gemini upgrade – cars and watches included

    From animations to notifications, everything in the Android UI looks like it will see serious improvement. Here’s what’s coming to this refreshed UI.

    1. Animations with a little more pep

    From the demos I’ve seen, animations across the board are springier and more responsive. What’s really nice about this change is that it almost feels more organic. For example, when you drag an element, the adjoining elements subtly react to the movement. This will also help make Android components feel more responsive, at least visually.

    2. Blurs and shading