More stories

  • Are consumers ready for drone delivery? Maybe, but will they pay for it?

    With news that Amazon has officially entered the drone delivery game, it might feel like adoption is inevitable. But what do consumers think of that possibility? A new survey polling more than 1,000 consumers across the U.S. sought to answer exactly that question, as well as to extract consumer sentiment about the coming of […]

  • Google once again delays phasing out third-party cookies

    Google is once again delaying the phaseout of third-party cookies in Chrome. The browser will now fully support the tracking technology until the second half of 2024, Google said on Wednesday. In 2021, Google originally committed to ending third-party cookie support within the Chrome browser in 2022. The commitment came two years after Google began working on its “privacy sandbox” for Chrome.
    The efforts were slow and met some setbacks along the way. In a blog post Wednesday, Google’s Privacy Sandbox VP Anthony Chavez said the company has taken a “deliberate approach to transitioning from third-party cookies.” He added, “The most consistent feedback we’ve received is the need for more time to evaluate and test the new Privacy Sandbox technologies before deprecating third-party cookies in Chrome.”

    Google has been working on this for years in response to growing consumer concerns about privacy. Typically, businesses have relied on third-party cookies and data aggregators to assess the behavior of their users across multiple domains. This, however, clearly came at the cost of customer privacy. Consequently, companies like Google and Apple began restricting the use of third-party cookies.

    Google’s first attempt to replace the third-party cookie with its own technology, called FLoC, was met with staunch opposition from some, a wary eye from others, and very little positive feedback. Given this poor reception, Google subsequently said Chrome would continue supporting third-party cookies until at least mid-2023. Then in January of this year, it rolled out the Topics API, which aims to track users anonymously while still giving advertisers enough data for interest-based ads.

    Beginning in early August, the Privacy Sandbox trials will expand to millions of users globally. Google will gradually increase the trial population throughout the rest of the year and into 2023. The company expects the Privacy Sandbox APIs to be launched and generally available in Chrome by Q3 2023. It will then begin phasing out third-party cookies during the second half of 2024.
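    Concretely, the “third-party cookies” at stake here are ordinary HTTP cookies that a cross-site embed marks with SameSite=None and Secure so that browsers will send them across domains. A minimal sketch using Python’s standard http.cookies module; the .tracker.example domain is a hypothetical ad-tech host, not anything from the article:

    ```python
    from http.cookies import SimpleCookie

    # A cookie readable in third-party (cross-site) contexts must opt in
    # with SameSite=None plus Secure; this is the class of cookie Chrome
    # now plans to keep honouring until the second half of 2024.
    cookie = SimpleCookie()
    cookie["uid"] = "abc123"
    cookie["uid"]["domain"] = ".tracker.example"  # hypothetical ad-tech domain
    cookie["uid"]["samesite"] = "None"
    cookie["uid"]["secure"] = True

    header = cookie["uid"].OutputString()
    print("Set-Cookie:", header)
    ```

    A first-party cookie would omit SameSite=None (defaulting to Lax in Chrome) and is unaffected by the phaseout.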

  • Race against time: Hackers start hunting for victims just 15 minutes after a bug is disclosed

    Attackers are becoming faster at exploiting newly disclosed vulnerabilities, according to Palo Alto Networks. The company warns in its 2022 report covering 600 incident response (IR) cases that attackers typically start scanning for vulnerabilities within 15 minutes of one being announced. Among this group are 2021’s most significant flaws, including the […]

  • These ransomware hackers gave up when they hit multi-factor authentication

    A ransomware attack was prevented simply because the intended victim was using multi-factor authentication (MFA) and the attackers decided it wasn’t worth the effort to attempt to bypass it. It’s often said that using MFA, also known as two-factor authentication (2FA), is one of the best things you can do to help protect your accounts […]

  • Mushroom meat and robot chefs: Chipotle's vision for the future of fast food

    This past spring, Chipotle announced the launch of a tech-focused venture fund to nurture food- and service-oriented technologies. The first round of investments just went out, and the news says a lot about the future of quick service. Through its Cultivate Next venture fund, Chipotle is backing Hyphen, a food service platform designed to help automate kitchens […]

  • Microsoft warns of stealthy backdoors used to target Exchange Servers

    Microsoft warns of an uptick in malware native to its Internet Information Services (IIS) web server that is used to install backdoors or steal credentials and is hard to detect. Microsoft has offered insights into how to spot and remove malicious IIS extensions, which aren’t as popular as web shells […]

  • Data breach costs record $4.3M with firms passing buck to customers

    The average cost of a data security breach has hit another record high of $4.35 million per incident, growing 12.7% over the past two years. And some businesses are passing the buck to customers, even as the cost of products and services has climbed amidst inflation and supply chain constraints. This year’s figure was up 2.6% from last year’s $4.24 million per breach, according to IBM’s 2022 Cost of a Data Breach report, which further revealed that 83% of companies surveyed had experienced more than one data breach.

    Conducted by Ponemon Institute, the report analysed 550 organisations across 17 global markets that were impacted by data breaches between March 2021 and March 2022. Just 17% said this was their first breach. In addition, 60% said they increased the price tag on their products and services due to losses suffered from the data breach. They also continued to chalk up losses long after the breach, with almost half of such costs incurred more than a year after the incident.

    Organisations in the US saw the highest average cost of a breach, which climbed 4.3% to $9.44 million, followed by the Middle East region, where the average reached $7.46 million this year, up from $6.93 million in 2021. Canada, the UK, and Germany rounded out the top five, with average losses of $5.64 million, $5.05 million, and $4.85 million per breach, respectively. Six of the 17 markets analysed, including Japan, South Korea, and France, saw a dip in their respective average breach cost.
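    The headline growth figures are easy to cross-check. A quick arithmetic sketch; note that the implied 2020 baseline is derived here from the quoted 12.7%, not a number taken from the report:

    ```python
    # Average breach cost per incident, in $M, from IBM's annual reports.
    cost_2021 = 4.24
    cost_2022 = 4.35

    # Year-over-year growth, which should match the reported 2.6%.
    yoy = (cost_2022 - cost_2021) / cost_2021 * 100
    print(f"one-year growth: {yoy:.1f}%")

    # "12.7% over the past two years" implies a 2020 baseline of about $3.86M.
    implied_2020 = cost_2022 / 1.127
    print(f"implied 2020 average: ${implied_2020:.2f}M")
    ```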
    Supply chains, user credentials fuel attacks

    Across the board, companies took an average of 207 days to identify a breach and 70 days to contain it, down overall from last year’s averages of 212 days to identify and 75 days to contain.

    Some 19% of breaches were the result of supply chain attacks, costing an average $4.46 million and clocking a lifecycle 26 days longer than the global average of 277 days, which measures the combined time to identify and contain a breach. Supply chain breaches were those in which a business partner was the initial point of compromise.

    Human errors, which encompassed negligent actions of employees or external contractors, accounted for 21% of incidents, while IT failures, the result of disruption or failure in a company’s IT systems that led to data loss, were behind 24% of breaches. The latter included errors in source code or process failures, such as automated communication errors. Some 11% of breaches were ransomware attacks, up from 7.8% last year and growing at a rate of 41%, but the average cost of such attacks dropped slightly to $4.54 million from $4.62 million in 2021.

    Attacks using stolen or compromised credentials remained the most common cause of a data breach, accounting for 19% of all incidents this year, the report found. These breaches cost an average $4.5 million per incident and had the longest lifecycle: 243 days to identify and 84 days to contain. Phishing was the second-most common cause, accounting for 16% of overall attacks, but the costliest, with an average $4.91 million in losses.

    Amongst sectors, healthcare suffered a record-high average breach cost of $10.1 million, up almost $1 million from 2021 and sealing its ranking as the most expensive industry. In fact, the sector’s breach costs have climbed 41.6% since 2020.
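    The “lifecycle” figures in the report are additive: time to identify plus time to contain. A quick sanity check of the numbers quoted above:

    ```python
    # Days to identify and contain a breach, per the 2022 report.
    identify_2022, contain_2022 = 207, 70
    lifecycle_2022 = identify_2022 + contain_2022
    print(lifecycle_2022)            # 277 days, the report's global average

    # Supply chain breaches ran 26 days longer than that average.
    print(lifecycle_2022 + 26)       # 303 days

    # Last year's figures, for comparison.
    print(212 + 75)                  # 287 days
    ```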
    The financial services sector recorded the second-highest average breach cost of $5.97 million, followed by pharmaceuticals, technology, and energy at $5.01 million, $4.97 million, and $4.72 million, respectively. The average breach cost for organisations running critical infrastructures was $4.82 million, which was $1 million more than the average for organisations in other sectors. Critical infrastructure companies came from sectors including financial services, energy, transport, healthcare, and government. Amongst these organisations, 28% experienced a destructive or ransomware attack and 17% pointed to a compromised supply chain partner.

    Mitigating losses with touted security strategies

    The IBM study also examined differences in the impact of a data breach between companies that had and had not adopted security strategies and technologies, such as zero trust, extended detection and response (XDR), and artificial intelligence (AI). The report noted that the nearly 80% of critical infrastructure organisations without a zero trust strategy saw a higher average breach cost of $5.4 million, or $1.17 million more than those that adopted zero trust frameworks. Across the board, 41% of organisations said they had deployed a zero trust security framework, up from 35% last year.

    In addition, those that deployed security AI and automation tools saw breach costs $3.05 million lower than peers that did not implement any such tools; organisations without them also took 74 days longer to identify and contain a breach. The number of organisations using such tools hit 70% this year, up from 59% in 2020. Meanwhile, the 43% of companies that were in the early stages of deploying security practices across their cloud platforms, or had yet to start, saw losses at least $660,000 higher on average than those with mature cloud security environments.
    Some 44% of breaches in the study happened in the cloud. Those occurring in a hybrid cloud environment cost an average $3.8 million, compared to $4.24 million for breaches in private clouds and $5.02 million in public clouds. At $4.99 million per incident, remote work-related breaches also cost almost $1 million more on average than breaches where remote work was not a factor.

    Some 44% of companies had implemented XDR technologies, and they saw breach lifecycles about a month shorter, on average, than peers that had not deployed such tools, which took 304 days to identify and contain a breach. Amongst organisations that suffered ransomware attacks, those that paid up clocked breach costs $610,000 lower (excluding the cost of the ransom) than those that chose not to pay. In addition, the 62% of companies that said they were insufficiently staffed to support their cybersecurity needs saw breach costs an average $550,000 higher than those that were adequately staffed.

  • Tech giants, including Meta, Google, and Amazon, want to put an end to leap seconds

    In her hit song, Cher sang, “If I could turn back time.” For her, that would be a good thing. But in the computing world, Meta, formerly Facebook, believes it would be a very bad thing indeed. In fact, Meta wants to get rid of leap seconds, which keep computing time in sync with Earth’s rotational time. Meta’s not the only one that feels that way. The US National Institute of Standards and Technology (NIST), its French counterpart (the Bureau International des Poids et Mesures, or BIPM), Amazon, Google, and Microsoft all want to put an end to leap seconds.
    Why? As Meta explained in a blog post, “We bump into problems whenever a leap second is introduced. And because it’s such a rare event, it devastates the community every time it happens. With a growing demand for clock precision across all industries, the leap second is now causing more damage than good, resulting in disturbances and outages.” Therefore, Meta concludes, we should simply “stop the future introduction of leap seconds.”

    Computers require accurate timekeeping for pretty much everything they do: security, identification, networks, and more. Some systems rely on Global Positioning System (GPS) appliances and the GPSD daemon to tell the exact time.

    The problem is that Earth’s rotational time is not absolute; Earth’s spin speed varies in response to geological events. For example, Earth’s ice caps and ice-topped mountains are constantly melting and refreezing, affecting the angular velocity of Earth’s rotation. This, in turn, slows down and speeds up our days. The International Earth Rotation and Reference Systems Service (IERS) tracks this, and every few years it adds a leap second. This is done to Coordinated Universal Time (UTC), the standard universal time system.

    Why do we have leap seconds? The idea was introduced in 1972. This periodic UTC update kept computer time in sync with observed solar time (UT1) and the long-term slowdown in the Earth’s rotation. That made astronomers and navigators happy; programmers and IT administrators, not so much.

    UTC is used by the internet’s Network Time Protocol (NTP) to set the time. For its part, NTP keeps all of our internet-connected devices in sync with each other. How does NTP know what time it is? By synchronizing NTP servers with atomic clocks. NTP is based on a hierarchy of levels, where each level is assigned a number called the stratum.
    Stratum 1 (primary) servers at the lowest level are directly synchronized to national time services via satellite, radio, or modem. Stratum 2 (secondary) servers are synchronized to stratum 1 servers, and so on. Usually, NTP clients and servers connect to stratum 2 servers. So far, so good, but how do stratum 1 servers sync up with clocks? Many of them use GPSD. This service daemon monitors one or more GPS receivers for location, course, velocity, and, for our purposes, time.

    The problem is that this system is complicated and prone to failure. If you’re a system or network administrator, you already know this. Meta’s researchers insist that “introducing new leap seconds is a risky practice that does more harm than good, and we believe it is time to introduce new technologies to replace it.”

    In the past, leap seconds have crashed programs or even corrupted data, due to weird data storage timestamps. For example, both Reddit and Cloudflare have had nasty outages due to leap seconds. As Linux founder Linus Torvalds said in response to the problem that tripped up Reddit, “Almost every time we have a leap second, we find something. It’s really annoying, because it’s a classic case of code that is basically never run, and thus not tested by users under their normal conditions.”

    Worse still, Meta points out that “with the Earth’s rotation pattern changing, it’s very likely that we will get a negative leap second at some point in the future. The timestamp will then look like this: 23:59:58 -> 00:00:00.” What happens then? We don’t know. “The impact of a negative leap second has never been tested on a large scale; it could have a devastating effect on the software relying on timers or schedulers,” Meta explains. Hence the company’s conclusion: stop introducing leap seconds. Period. End of statement.
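    The three cases, a normal day, a positive leap second, and the never-tested negative leap second, can be sketched as sequences of UTC second labels. This is purely illustrative; real system clocks typically smear or step the time rather than display a :60 second:

    ```python
    def day_final_seconds(leap: int) -> list[str]:
        """Labels for the final UTC seconds of a day.

        leap = 0  -> a normal 86,400-second day
        leap = +1 -> a positive leap second inserts 23:59:60 (as in 2016)
        leap = -1 -> a negative leap second skips 23:59:59 entirely,
                     the case Meta warns has never been tested at scale
        """
        tail = ["23:59:57", "23:59:58", "23:59:59", "23:59:60"][: 3 + leap]
        return tail + ["00:00:00"]

    print(day_final_seconds(0))    # ... 23:59:59 -> 00:00:00
    print(day_final_seconds(+1))   # ... 23:59:60 -> 00:00:00
    print(day_final_seconds(-1))   # 23:59:58 -> 00:00:00, a second vanishes
    ```

    Code that assumes every day contains a 23:59:59 is exactly the “basically never run” path Torvalds describes.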
    And if our computing clocks don’t agree with the stars above us? That’s a problem for astronomy application developers, not the rest of us. Eventually, we’ll need to change the clocks again. After all, the lack of leap days eventually led to Britain losing 11 days when it switched from the Julian to the Gregorian calendar in 1752. But Meta thinks that we’ll do just fine for the next thousand years or so without any more leap seconds.