More stories

  • Google embeds AI agents deep into its data stack – here’s what they can do for you

    Joan Cros/NurPhoto via Getty Images. ZDNET’s key takeaways: Google is introducing powerful technology for agents and data. It is also introducing a series of data-centric agents. A new command-line AI coding tool is now available. I am no stranger to hyperbolic claims from tech companies. Anyone who’s on the receiving end of a firehose of […]

  • Knowing these 7 rules helped me optimize my home security camera for the best footage

    Maria Diaz/ZDNET. If you’re a subscriber to the Nextdoor app, you’ve seen plenty of footage of prowling ne’er-do-wells caught in the eye of a video doorbell or home security camera. Hopefully, you don’t have your own first-hand experience with suspicious characters milling around your front porch, or far worse.

    Also: Unplugging these 7 common household devices helped reduce my electricity bills

    For good reason, security cameras continue to grow in popularity, and we can expect them to become even more reliable and affordable this year. While we don’t necessarily require super high-res imagery from these discreet little devices, you can get the most out of them by being mindful of a few factors, especially where you position your camera(s).

    1. Avoid obstructions (even future obstructions)

    Obviously, you won’t be putting a lens behind anything that blocks its view. Sometimes, though, that can include objects that change in size or shape over time, like trees and shrubs. A clear wintertime view of your yard may become a different story when branches bloom with new foliage in the spring.

    The same can apply to interior views, at least with objects that come and go. Will shutting a door somewhere within your camera’s line of sight block out a good percentage of its field of vision? Will your pet cat find a favorite spot to curl up in for hours at a time, right in front of the device?

  • Why I recommend this $2,000 mirrorless camera to both beginners and professionals

    ZDNET’s key takeaways: Sigma’s BF is a $2,200, 35mm full-frame, mirrorless digital camera that radically changes the mode of operation by replacing the gaggle of buttons with an elegant click-wheel. It’s a great first camera, but it also has tons of pro features. A future upgrade to a higher-resolution 60-megapixel sensor would be a welcome […]

  • Google is building a Linux terminal app for native Android development – here’s why that’s huge

    Jack Wallen/ZDNET. ZDNET’s key takeaways: Google is developing another Linux terminal app. The app runs a full Debian environment. Developers will be able to build Android apps on device.

    For some time, Android has had access to a terminal app that ran a full-blown, text-only Linux environment. This app is enabled via Android’s developer options and makes it possible for users to run Linux commands (even SSH). From Google’s perspective, that wasn’t enough.

    Also: 5 Linux terminal apps better than your default

    It seemed a bit odd when Google went mum on the Linux Terminal app at the annual I/O developer conference. Even with that silence, news has surfaced that points to Google releasing a new take on the Linux terminal app, one that targets developers.

    Build directly on Android devices

    This new Linux terminal app will allow developers to build Android apps directly on Android devices. That’s a huge change from having to build on emulators running on top of a desktop OS.

    Also: 5 surprisingly productive things you can do with the Linux terminal

    The new Linux terminal app uses the Android Virtualization Framework to boot into a Debian image running on a virtual machine. This new terminal app provides a full-blown Linux development environment that lets developers leverage the tools they need to build native apps, including Android Studio.
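    To make that workflow concrete, here is a minimal sketch of what an on-device build inside such a Debian environment might look like. The package names, the repository URL, and the Gradle task are assumptions based on a standard Debian toolchain, not commands confirmed for Google’s app:

    ```shell
    # Hypothetical session inside the Debian terminal (package names assumed)
    sudo apt update
    sudo apt install -y openjdk-17-jdk gradle git

    # Fetch an existing Android project and build a debug APK on the device itself
    git clone https://example.com/my-android-app.git   # placeholder URL
    cd my-android-app
    gradle assembleDebug   # standard Gradle task for a debug build
    ```

    The point of the sketch is the shape of the workflow: the same apt-based toolchain setup and Gradle build a developer would run on a desktop Linux box, but executed directly on the phone or tablet.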

  • My go-to LLM tool just dropped a super simple Mac and PC app for local AI – why you should try it

    Jack Wallen / Elyse Betters Picaro / ZDNET. ZDNET’s key takeaways: Ollama’s developers have released a native GUI for MacOS and Windows. The new GUI greatly simplifies using AI locally. The app is easy to install and allows you to pull different LLMs.

    If you use AI, there are several reasons why you would want to work with it locally instead of from the cloud. First, it offers much more privacy. When using a Large Language Model (LLM) in the cloud, you never know if your queries or results are being tracked or even saved by a third party. Also, using an LLM locally saves energy. The amount of energy required to use a cloud-based LLM is growing and could be a problem in the future. Ergo, locally hosted LLMs.

    Also: How to run DeepSeek AI locally to protect your privacy – 2 easy ways

    Ollama is a tool that allows you to run different LLMs. I’ve been using it for some time and have found that it simplifies the process of downloading and using various models. Although it does require serious system resources (you wouldn’t want to use it on an aging machine), it runs fast and lets you switch between models. But Ollama by itself has been a command-line-only affair. There are some third-party GUIs (such as Msty, which has been my go-to), but until now, the developers behind Ollama hadn’t produced their own GUI. That all changed recently, and there’s now a straightforward, user-friendly GUI, aptly named Ollama.

    Works with common LLMs – but you can pull others

    The GUI is fairly basic, but it’s designed so that anyone can jump in right away and start using it. There is also a short list of LLMs that can easily be pulled from the LLM drop-down list. Those models are fairly common (such as the Gemma, DeepSeek, and Qwen models). Select one of those models, and the Ollama GUI will pull it for you. If you want to use a model not listed, you would have to pull it from the command line like so:

    ollama pull MODEL

    Where MODEL is the name of the model you want.

    Also: How I feed my files to a local AI for better, more relevant responses

    You can find a full list of available models in the Ollama Library. After you’ve pulled a model, it appears in the drop-down to the right of the query bar. The Ollama app is as easy to use as any cloud-based AI interface on the market, and it’s free for MacOS and Windows (sadly, there’s no Linux version of the GUI). I’ve kicked the tires of the Ollama app and found that, although it doesn’t have quite the feature set of Msty, it’s easier to use and fits in better with the MacOS aesthetic. The Ollama app also seems to be a bit faster than Msty (in both opening and responding to queries), which is a good thing because local AI can often be a bit slow (due to a lack of system resources).
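    Beyond the GUI and the CLI, a locally running Ollama server also exposes a small REST API on port 11434, which is handy for scripting. Below is a minimal Python sketch against that API’s `/api/generate` endpoint; the model name `gemma3` is just an example, and the final call assumes the Ollama server is already running with that model pulled:

    ```python
    import json
    from urllib import request

    # Ollama's default local endpoint for one-shot text generation
    OLLAMA_URL = "http://localhost:11434/api/generate"

    def build_generate_request(model: str, prompt: str) -> dict:
        """Build the JSON body for Ollama's /api/generate endpoint.

        stream=False asks for a single JSON response instead of a
        stream of partial chunks.
        """
        return {"model": model, "prompt": prompt, "stream": False}

    def ask_ollama(model: str, prompt: str) -> str:
        """Send a prompt to a locally running Ollama server and return its reply."""
        body = json.dumps(build_generate_request(model, prompt)).encode("utf-8")
        req = request.Request(
            OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
        )
        with request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    # Example (needs a running server and a pulled model):
    # print(ask_ollama("gemma3", "In one sentence, why run an LLM locally?"))
    ```

    This is the same plumbing the GUI and third-party front ends like Msty sit on top of, so anything you can do in the app you can also script.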

  • This palm-sized power bank can charge multiple devices at once – and I’m all for the price

    Voltme Hypercore 10K power bank. ZDNET’s key takeaways: Voltme’s Hypercore 10K power bank is available on Amazon for $23. It’s very compact and easy to carry, with both USB-C and USB-A ports to charge two devices simultaneously. It’s small but chunky, and some might prefer a flatter charger. Power banks are a tradeoff: […]