More stories


    Growing reliance on third-party suppliers signals increasing security risks

    Adversaries are turning their focus to cheaper, easier targets within an organisation’s supply chain, especially as businesses increasingly acquire software from external suppliers. In this first piece of a two-part feature, ZDNet looks at how organisations in Asia-Pacific are facing more risks even as the perimeter they need to protect extends far beyond their own networks.

    There had been a spate of third-party cybersecurity attacks since the start of the year, with several businesses in Singapore and across Asia impacted by the rippling effects of such breaches. Just last month, personal details of 30,000 individuals in Singapore might have been illegally accessed following a breach that targeted a third-party vendor of job-matching organisation, the Employment and Employability Institute (e2i). Earlier this year, personal data of 580,000 Singapore Airlines (SIA) frequent flyers as well as 129,000 Singtel customers also were compromised through third-party security breaches.

    That Singtel and SIA had been compromised through such attacks did not come as a surprise to Benjamin Ang, senior fellow of cyber homeland defence and deputy head of the Centre of Excellence for National Security (CENS). Established in April 2006, CENS is a research unit of the Nanyang Technological University’s S. Rajaratnam School of International Studies and consists of local and overseas analysts specialising in national and homeland security issues. Ang told ZDNet in a video call that the IT ecosystem had been built for efficiency and speed of deployment. To achieve this in software development, libraries such as DLLs (Dynamic Link Libraries) had to be established so data could be pulled from different places. Enterprises also did not build every application on their own, choosing instead to acquire software from external suppliers. “And whoever they acquire from has their own software development system that we have to trust they are securing,” he noted.

    Cheaper, easier targets within supply chains

    CyberGRX’s chief information security officer (CISO) Dave Stapleton also pointed to an increasing dependence on third-party products over the past 15 years, with businesses outsourcing their operations to achieve economies of scale and access specialised products. It then would make sense for adversaries to go after secondary targets, rather than their primary one, to breach a network, said Stapleton in a video call.

    He noted that recent attacks also had appeared indiscriminate, straying from the more targeted and direct nature of APT (advanced persistent threat) attacks, which had gained in popularity over the past few years. This seemed to be the case for the Microsoft Exchange Server hack, where hackers adopted a scatter approach that exposed thousands of companies that might not have been the main target. Stapleton said more organisations would face a challenge should such indiscriminate supply chain attacks become more popular. Impact would be more widespread, especially as pivotal third-party applications used by millions worldwide were targeted and breached, as was the case with SolarWinds, he said. Noting that third-party attacks were not new, he said: “What we’re seeing now is a shift in mindset and strategy of threat actors to focus more on these pivotal third parties that have links to supply chains. And from the attacker’s perspective, compromising a third party can be a cheaper and easier entry point to [breach a] primary target.”

    They also were easier targets, said Sanjay Aurora, Darktrace’s Asia-Pacific managing director. He confirmed there had been a plethora of attacks this year in which adversaries focused on the supply chains of their main targets, since these companies would typically be guarded like a fortress. The hackers’ ultimate aim here was data exfiltration; they would hunt for weak links along the supply chain, where a supplier had failed to keep up with patches, to breach the network and illegally access the data of their main target, Aurora said.

    He advocated the use of artificial intelligence (AI) to better combat such attacks as well as ransomware, which was the leading threat vector. Coupled with self-learning capabilities, AI-powered security tools could autonomously identify vulnerabilities and changes in patterns, and predict and respond to malicious attacks, he said. This would be critical for industrial environments and operational technology (OT) systems, where the same AI approach, identifying unusual movements across the network, could be applied without the need to change or swap out old systems, he said. According to Aurora, Darktrace’s AI system autonomously performed more than 150,000 investigations each week and responded to a security threat every six seconds.

    Reed noted that the most common cause behind a breach still was someone clicking on a phishing link or malware. Adding that it was difficult to train people and foolproof the organisation, he said AI and machine learning would plug the gaps. And the threat landscape would only get more complex as more companies digitalised and adopted cloud, and with the emergence of 5G networks. Aurora said: “When you can’t even define what a network is [and] how to protect it, the only way to do so is to insert AI wherever your data, digital assets, and remote workforce are. It’s a digital estate that now has more complexities, and we can use probes, sensors, and cloud-native AI machines to process all the information in real time to get a full view of what’s going on.”

    Stapleton said: “Our perimeter extends far beyond our networks. And now you’re talking about a remote workforce, which pushes everyone outside of the network. Third parties should be looked at as an extension of our security [strategy], but I don’t think most of us are there yet.
That’s the black hole I’m seeing.”

    Check Point’s research head Lotem Finkelstein added that there was no longer any distinction between private and corporate networks, with employees, himself included, working from home on the same network to which their family members also were connected. “In past decades, we’ve invested in protecting corporate networks, but in just the last year, we’ve opened many doors to different networks,” Finkelstein said. “IoT (Internet of Things) and 5G also have allowed us to work from anywhere with high speed, which means we may see more employees working from abroad across multiple locations.” This then would require a completely new security framework, where prohibiting someone living in another country from accessing the corporate network in Singapore, for instance, would no longer be feasible. “Five years from now, this won’t be possible because employees will be able to live and work from anywhere and will need access to the corporate network,” he said. “We will need to change the strategic thinking behind securing the network based on localisation, to allow people to access data securely and enable the employee’s ecosystem to protect itself.”
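    The self-learning, pattern-based detection Aurora described can be illustrated with a minimal sketch: learn a baseline of normal network behaviour, then flag large deviations. This is a generic anomaly detector, not Darktrace’s actual system; the traffic numbers and threshold here are invented for illustration.

```python
from statistics import mean, stdev

def learn_baseline(samples):
    """Learn a simple baseline (mean and spread) from normal traffic counts."""
    return mean(samples), stdev(samples)

def is_anomalous(observation, baseline, threshold=3.0):
    """Flag observations deviating from the baseline by more than
    `threshold` standard deviations (a crude stand-in for self-learning AI)."""
    mu, sigma = baseline
    return abs(observation - mu) > threshold * sigma

# Outbound connections per minute from one host during normal operation.
normal_traffic = [42, 39, 44, 41, 40, 43, 38, 45, 41, 42]
baseline = learn_baseline(normal_traffic)

print(is_anomalous(43, baseline))   # ordinary load: False
print(is_anomalous(400, baseline))  # exfiltration-like spike: True
```

    Real products model many signals at once (ports, timing, peers), but the principle is the same: no signatures, only deviation from learned behaviour.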


    First multi-node quantum network paves the way for the quantum internet

     Three nodes that can store and process quantum bits were linked to create the world’s first rudimentary quantum network.   
    Image: Marieke de Lorijn for QuTech
    Researchers in the Netherlands have successfully connected three separate quantum processors in what is effectively the world’s first multi-node quantum network. This paves the way for a large-scale quantum internet that governments and scientists have been dreaming up for decades.

    QuTech, a quantum research institute based in Delft, has published new work in which three nodes that can store and process quantum bits (also called qubits) were linked. This, according to the QuTech researchers, is the world’s first rudimentary quantum network.

    Connecting quantum devices is by no means a novelty: many researchers around the world are currently working on similar networks, but so far have only succeeded in linking two quantum processors. Establishing a multi-node connection, therefore, is a key step towards significantly expanding the size of the network. Driving much of the research effort is the objective of creating a quantum internet that could one day stretch across the surface of the planet. The quantum internet would exploit the strange laws of quantum mechanics to let quantum devices communicate with each other, and is expected to unlock a range of applications that cannot be run with existing classical means. For example, the quantum internet could link together small quantum devices to create a large quantum cluster with more compute power than the most sophisticated classical supercomputers.

    “A quantum internet will open up a range of novel applications, from un-hackable communication and cloud computing with complete user privacy to high-precision time-keeping,” said Matteo Pompili, a member of QuTech’s research team. “And like with the Internet 40 years ago, there are probably many applications we cannot foresee right now.”

    One of the key quantum properties that underpins the quantum internet is entanglement, a phenomenon that occurs when two quantum particles are coupled in such a way that they become fundamentally connected, no matter how physically distant they are from each other.

    When two quantum particles are entangled, their properties become linked, which means that any change to one of the particles will inevitably be reflected in the other one. In quantum communications, this means that scientists could effectively use entangled particles to ‘teleport’ information from one qubit to its coupled pair, even if the two are in separate quantum devices. For the system to hold up, however, entanglement must be established and maintained in the first place. In the past decade, this has been achieved by numerous research groups, typically by creating a physical link between two quantum devices. Through this link, often optical fiber, qubits can be created, entangled and then distributed between two separate quantum devices. But two nodes are hardly enough to create a large-scale network; and in a fiber optic cable, for example, entanglement cannot be maintained after about 100 kilometers, meaning that the quantum networks set up so far have been limited by the short distance they can cover. This is why QuTech’s research team has been developing a system based on intermediate nodes, similar to routers in the classical internet, which could maintain entanglement over larger distances.

    Bob, Alice and Charlie

    The architecture that the scientists have revealed is seemingly straightforward. A middle node, called Bob, has a physical connection to two outer nodes, called Alice and Charlie. This means that entanglement can be established between Bob and each of the outer nodes. Bob is equipped with two qubits, one of which is a memory qubit that allows the device to store an established quantum link, for example with Alice, while creating, thanks to its communication qubit, a new link with the other node, in this scenario with Charlie.

    Once both links with the outer nodes are created, Bob connects its own two qubits locally, which creates a fully connected network with entanglement between all three nodes. This means that a quantum link can be established between Alice and Charlie, even without a direct physical link between the two nodes. QuTech’s team also developed a first quantum network protocol, with a flag signal indicating that each operation has been completed successfully.

    “The main advantage of this demonstration is that we have a scalable way of linking multiple nodes into a network,” Ronald Hanson, who led the research team, tells ZDNet. “We have a memory that can store entangled state while the new entanglement is being prepared. And we have heralding signals that tell us when entanglement was successfully created. This enabled us to make entanglement between the three nodes that is ready to be used for further processing or other protocols. This is the first time this has been achieved in any quantum network setting.”

    The new network will provide a testbed to develop new quantum internet hardware, software and protocols; but the experiment will also have to evolve from a proof-of-concept into a workable solution in order to scale up quantum networks. In effect, the researchers have so far ‘only’ connected single, separate qubits, rather than quantum processors. They will now focus on adding more qubits to their three-node network, and on adding higher-level software and hardware layers. But in the future, the team expects the current approach to be tested outside the lab on existing telecom fiber. “The future quantum internet will consist of countless quantum devices and intermediate nodes,” says Hanson.
“Colleagues at QuTech are already looking into future compatibility with existing data infrastructures.”

    QuTech’s research is supported by the EU’s Quantum Internet Alliance, which is part of the bloc’s decade-long, €1 billion ($1.2 billion) Quantum Flagship, an initiative launched in 2018 to boost quantum research and development. The EU is far from alone in promoting the development of the quantum internet. China and the US are equally interested in advancing quantum networks, and have already achieved milestones in the field. Chinese scientists, for example, recently established entanglement over a record-breaking 1,200 kilometers. Earlier this year, scientists from Cleland Labs in the US also succeeded for the first time in entangling two separate qubits by connecting them via a cable, another breakthrough that’s expected to accelerate the creation of quantum networks.
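    The Alice-Bob-Charlie scheme, with its memory qubit and heralding flags, can be sketched as a toy classical simulation of the protocol flow. This is purely illustrative: the classes, method names, and success probability below are invented and say nothing about QuTech’s actual hardware or protocol stack.

```python
import random

class Node:
    """Toy network node; tracks which peers it shares entanglement with."""
    def __init__(self, name):
        self.name = name
        self.entangled_with = set()

def heralded_entangle(a, b, success_prob=0.5):
    """One attempt at link-level entanglement; the boolean is the heralding flag."""
    if random.random() < success_prob:
        a.entangled_with.add(b.name)
        b.entangled_with.add(a.name)
        return True
    return False

def connect_three(alice, bob, charlie):
    """Bob links to Alice (link parked in Bob's memory qubit), then to Charlie
    (via Bob's communication qubit), then swaps entanglement locally so that
    Alice and Charlie end up linked with no direct physical channel."""
    while not heralded_entangle(bob, alice):
        pass  # retry until the heralding flag signals success
    while not heralded_entangle(bob, charlie):
        pass
    # Bob's local operation on its two qubits performs the entanglement swap.
    alice.entangled_with.add(charlie.name)
    charlie.entangled_with.add(alice.name)

alice, bob, charlie = Node("Alice"), Node("Bob"), Node("Charlie")
connect_three(alice, bob, charlie)
print(sorted(alice.entangled_with))  # ['Bob', 'Charlie']
```

    The point of the sketch is the ordering: the memory qubit is what lets Bob hold one heralded link while the second is still being attempted, which is exactly what makes the scheme scalable beyond two nodes.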



    Government picks 81 regional sites to fund through AU$90 million program

    Image: Asha Barbaschow/ZDNet
    The Deputy Prime Minister of Australia, Michael McCormack, alongside Communications Minister Paul Fletcher and Minister for Regional Health, Regional Communications and Local Government Mark Coulton, announced on Friday that 81 sites have been selected to carve up the AU$90 million available under the Regional Connectivity Program. Thanks to co-funding arrangements with recipients of the grants, state and local-level governments, regional businesses, and community organisations, the total spent will be in excess of AU$180 million.

    However, details on the successful projects are scant. Besides wireless networks for the Gundagai, Cootamundra, and Snowy Valleys areas in New South Wales, the rest are a mystery and will be announced in coming weeks. “Grants have been allocated on a competitive basis, with the value of successful projects ranging from AU$80,500 for targeted mobile capacity upgrades in small towns to AU$8,750,000 for the deployment of large-scale fixed wireless broadband networks across entire regions,” the trio said. The lack of detail left nothing but flowery hand-waving for the ministers concerned. “From Gippsland to the WA Grainbelt, the Regional Connectivity Program will provide targeted upgrades to connectivity in regional areas that need it the most, ensuring that more Australians can access high-speed, reliable broadband and mobile services,” Fletcher said.

    The program, previously pinned at AU$60 million, formed part of the government’s response to the 2018 Regional Telecommunications Review.

    Also on Friday, Fletcher announced in more concrete terms that the government has appointed Andrew Dix to the board of NBN for three years. Dix currently is the chair of the audit and risk committee for the Bureau of Meteorology as well as Services Australia, and is a board member of Swinburne University and the Victorian Farmers Federation. “Dix is a chartered accountant and former Telstra executive with considerable expertise to complement the current NBN Co Board, including his financial management skills and his experience in the telecommunications sector,” Fletcher said.


    Aussie Broadband outage takes down telco's own site and lines

    Image: Getty
    A lunchtime outage has hit Victorian customers of Aussie Broadband, as well as the telco’s own infrastructure and website. “We are currently experiencing a Victorian wide outage. Our website and our phone line is also affected. We will have more details soon. Thanks so much for your patience,” the company tweeted at 1:35pm AEST. Half an hour later, the company said it saw systems coming back online, and its site was reachable again. It suggested customers restart their routers if not yet reconnected. By 3:21pm AEST, the company said its systems were fully back online.

    Earlier this week the telco announced its move into the white label market, allowing others to sell its NBN, Opticomm, and VoIP services. The company already has one customer shifting 25,000 services onto its network in the 2022 financial year, but has chosen not to name them. The company on Tuesday also updated its number of connections as of the end of March. Residential broadband connections now sit at 340,000, with 33,500 business broadband customers. Overall, the company added 30,400 services to its network in the first quarter of the calendar year.


    Huawei looks to diversify product focus, confident against Chinese cloud players

    Huawei Technologies will continue to diversify its product focus as it looks to buffer a decline in its smartphone sales, with its other connected devices, including laptops and smart TVs, seeing strong growth this past year. The Chinese tech giant also believes its wide product portfolio will stack up well against its local peers, such as Alibaba and Tencent, all of which are looking to grow their footprint in the Southeast Asian region. Huawei reported sluggish performance in its recent earnings report, where its annual operating profit fell for the first time in over five years to 72.5 billion yuan ($11.09 billion) in 2020. China also was the only region where it saw revenue climb, by 15.4% to 585 billion yuan, with all other regions, which included Asia-Pacific, EMEA, and the Americas, seeing dips in revenue of between 8.7% and almost 25%. Huawei attributed the loss to a dip in its smartphone sales, which were impacted by ongoing US export sanctions that blocked access to Google’s Android ecosystem. US export bans also cut the Chinese vendor’s access to core chipsets, which Huawei said disrupted its supply chain.

    It pushed the vendor to diversify its chip suppliers as well as its product focus. At its earnings briefing, Huawei’s rotating chairman Ken Hu said the vendor would look to drive focus on the company’s other connected devices, such as smart TVs, laptops, and smart watches. Pointing to the company’s “1+8+N” strategy, in which “8” referred to its range of connected devices, Hu said revenue from these eight devices had buffered the impact from a dip in its smartphone sales. In fact, its “8+N” business had clocked a 65% year-on-year increase in sales last year, chalking up 891.4 billion yuan ($136.38 billion) in revenue. “N” comprised third-party Internet of Things (IoT) devices that connected via Huawei’s HiLink platform and file-sharing technologies, while “1” referred to Huawei’s smartphone products. Hu said the vendor would be working to introduce more hardware products, software, and services, as it looked to build an ecosystem that extended beyond its smartphone.

    In an interview with ZDNet, Huawei’s Asia-Pacific president Jay Chen was unable to provide any update on the US sanctions, but noted that these had a far-reaching impact on trust across the entire global value chain. Chen noted that the revenue and market share of US companies also would be adversely affected in the long term. In spite of the pressures on its chip supply, he said Huawei would continue to introduce new handsets and look to maintain its market position. He reiterated the company’s aim to diversify its chip partners and supply chain. And despite the challenging past year, the vendor remained bullish about its growth potential across the wider Asia-Pacific, outside of China. Chen cited accelerated digital transformation efforts in the region as a key driver and significant growth potential. The population also was big and digital acceptance high, which made the region an important and strategic market for Huawei, he said.

    In fact, outside of China, Asia-Pacific was the fastest-growing region for Huawei’s cloud business, according to Hunter Shao, Huawei’s Asia-Pacific vice president of industry development, who noted that the vendor saw its revenue climb three-fold year-on-year. Shao added that Huawei was targeting to be amongst the top three cloud providers worldwide within the next three years. He said Huawei offered a wide range of products and services that spanned devices, edge computing, network equipment, and software, which bolstered its cloud play. The vendor also had a long history in Asia-Pacific that stretched back two decades, during which it served local carriers and telecommunications service providers across the various markets. It also had worked with enterprise customers that tapped its infrastructure portfolio for more than 10 years.
    Asked how it was working to address security concerns that continued to persist today, Chen stressed that security was a key consideration across Huawei’s entire product range, whether it was cloud or 5G equipment. It also was embedded in all its internal processes and product design, he said. He noted that the vendor adhered to industry standards with regards to network security, which was critical to establishing trust. He pointed to GSMA’s Network Equipment Security Assurance Scheme (NESAS) as one such standard that market players and stakeholders should adopt. NESAS is a voluntary initiative introduced to provide a security enhancement program focused on mobile network infrastructure equipment. It encompasses equipment designed to facilitate functions defined by 3GPP (3rd Generation Partnership Project) and deployed by mobile network operators on their networks. Specifically, it comprises security assessments of vendor development and product lifecycle processes as well as security evaluations of network products. Chen encouraged global participation in NESAS, without which he said it would be difficult to resolve any questions about network security. He also urged that geopolitics be left out of discussions concerning network security, adding that Huawei was the first vendor globally that was willing to sign an agreement stating there was no backdoor in its equipment.

    To further tap cloud potential in the region, Shao said Huawei would continue to work with partners to deliver services across various sectors including smart cities, unmanned stores, and autonomous vehicles. The vendor also would further build out its cloud coverage in Asia-Pacific, where it currently had four local POPs (points of presence), including in Singapore and Hong Kong.
    Asked about competition from its Chinese peers such as Xiaomi, Tencent, and Alibaba, the latter two of which also were eyeing cloud growth in this region, Chen again pointed to Huawei’s diverse product portfolio that included software and services, as well as its “strong hardware DNA”, as key competitive advantages. He added that the vendor had built up a “very mature ecosystem” in its international business over the last two decades, with teams in every local market. “So we’re confident against the competition and believe we can play well,” he said.


    BoM floats idea of Antarctic subsea cable and satellite upgrades

    Image: Getty Images
    The Australian Bureau of Meteorology (BoM) has pulled out its ultimate wishlist, and asked for one of everything, by floating the idea of running a subsea data cable to Antarctica and improving satellite connectivity to its weather stations. Writing in a submission to the Joint Standing Committee on the National Capital and External Territories’ inquiry into the availability of and access to enabling communications infrastructure in Australia’s external territories, the BoM called for fibre to be laid between Australia’s Antarctic research stations of Davis, Casey, Mawson, and Macquarie Island. “An intercontinental submarine fibre optic cable from Australia to the Antarctic continent would establish a reliable, high bandwidth, low latency communication service to Australian research stations for the next 25 years and beyond as a long-term communications plan,” it said. “Establishing an intercontinental submarine cable to Antarctica may be beneficial to Australian interests, and better ensure safe and secure operations in the territory by diversifying the communication infrastructure used to operate the Bureau’s Antarctic meteorological services and allow for the expansion of services and capabilities across the vast continent.”

    A submission by the Department of Agriculture, Water and the Environment provided an idea of the ambition of the BoM’s request, an idea which the department endorsed. “The Australian Antarctic Division is headquartered in Hobart, Tasmania. The distance from Hobart to the four research stations is 3443, 4838, 5475 and 1542 kilometres respectively,” the department said. Currently, the Antarctic Division uses C-band satellite connections from Speedcast at each of the four stations, which are capable of 9Mbps with 300ms of latency. Each station also has a backup data link from the Inmarsat Broadband Global Area Network, which provides a mere 0.65Mbps with latency of 700 milliseconds.
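    Those link speeds put hard limits on how much data the stations can move. A quick back-of-the-envelope sketch using the figures above; note the 100GB daily payload is a made-up illustration, and 10Gbps stands in for a single fibre connection of the kind discussed in the submissions:

```python
def transfer_hours(gigabytes, mbps):
    """Hours to move `gigabytes` of data over a link of `mbps` megabits/second."""
    megabits = gigabytes * 8 * 1000  # 1 GB = 8,000 megabits (decimal units)
    return megabits / mbps / 3600

day_of_station_data_gb = 100  # hypothetical daily science payload, for illustration

print(round(transfer_hours(day_of_station_data_gb, 0.65), 1))   # Inmarsat backup: 341.9 hours
print(round(transfer_hours(day_of_station_data_gb, 9), 1))      # C-band link: 24.7 hours
print(round(transfer_hours(day_of_station_data_gb, 10_000), 3)) # 10Gbps fibre: 0.022 hours
```

    In other words, a payload that would saturate the C-band link for a full day would clear a single fibre connection in about 80 seconds.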

    “The capacity of a fibre cable would be in the order of tens to hundreds of terabits per second, with an individual connection having speeds in the ten to hundred gigabit per second range,” the department wrote. “Currently, there are no submarine fibre cable connections to the Antarctic continent, and such a connection would provide unprecedented speed and reliability, and would establish Australia as a key leader and international partner in the Antarctic.” However, the Antarctic environment poses some challenges, mainly in the form of icebergs. “Approaches to shore would need to be carefully considered, as well as mitigation options and impacts if the cable connection were to be interrupted, especially if medical or safety systems evolve to rely on increased communications capability,” it said. “In situations where an approach to shore would be prevented by icebergs, intracontinental wireless communication would need to be developed.”

    Beyond cables, the BoM also floated the idea of improving its satellite connectivity options, including the launch of two geostationary satellites capable of providing connectivity and carrying out the high-resolution weather observations that presently rely on a Japanese satellite. The BoM explained its remit covers 53 million square kilometres, borders 10 countries, and covers large parts of the Indian, Pacific, and Southern Oceans, as well as the Australian continent and its Antarctic territories. “Today, the Bureau is reliant on commercial communication satellite providers to access a majority of our remote sites as well as the Antarctic research stations,” it said. “This presents a risk of general satellite technology failures from environmental space weather or other types of interference and a lack of sovereign control of the platforms.
This could be mitigated by installation of an Australian Government satellite capability that serves our geographic areas of interest.”

    Last month, the Bureau similarly told the committee looking into developing Australia’s space industry that the nation needed a sovereign satellite capability. Around 20% of BoM sites on remote islands use 3G connectivity as a primary link, with low-bandwidth satellite backup. The Bureau is looking at whether to use 4G and 5G connections instead, and at using NBN satellites. “One of the critical concerns for the Bureau is network latency. Issues with NBN satellite connections diminishing the throughput and transmission of information to the Bureau can result in delayed radar data processing and output to the operations centre and the Bureau’s mobile weather app,” it wrote. “Additionally, impacts on telecommunications and video conferencing from network latency can adversely affect access to data products, integration with the operations centre, and staff communications at remote sites.” The Bureau said it would look at low Earth orbit satellite services like SpaceX Starlink once they become available. In an earlier submission, SpaceX said it could offer services to Australia’s external territories as early as 2022.

    Building on previous comments from those on Norfolk Island calling for a cable connection, the Norfolk Island Central School said that under a deal signed with Telstra, it should be allocated 5Mbps per student. “Norfolk Island Central School should have an internet connection of approximately 1500Mbps,” it said. “To achieve that using the current, limited connection availability, we would need approximately 60 separate NBN SkyMuster Plus connections at school — including the 60 satellite dishes — and even then, the bandwidth is not guaranteed.”
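    The school’s figures imply the arithmetic below; note the student count and per-connection speed are derived from the quoted numbers, not stated in the submission itself:

```python
mbps_per_student = 5     # per the Telstra deal cited by the school
target_mbps = 1500       # the school's stated requirement
connections_needed = 60  # the school's SkyMuster Plus estimate

# Values implied by the quoted figures (not stated directly).
implied_students = target_mbps / mbps_per_student
implied_mbps_per_connection = target_mbps / connections_needed

print(implied_students)             # 300.0 students
print(implied_mbps_per_connection)  # 25.0 Mbps assumed per satellite connection
```

    That is, the estimate assumes roughly 300 students and about 25Mbps of usable bandwidth per SkyMuster Plus connection, which is why a single cable connection looks so much more practical.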


    Chorus customers on 1Gbps connections inch closer to 150,000 mark

    Image: Chorus
    New Zealand broadband wholesaler Chorus said it now has 143,000 users on 1Gbps connections, after uptake grew by 7,000 connections during the third quarter. For the three months to the end of March, the company added 29,000 customers across all fibre connections, and said the average monthly data use on fibre increased from 460GB to 491GB. Overall, Chorus saw its number of broadband connections decline by 2,000 to 1.181 million, with the company blaming COVID-19 and its impact on network migration and population growth, which it said constrained broadband growth. This, however, was counterbalanced by the return of students from holidays, which Chorus said helped restore some prior-period disconnections. In terms of data usage, monthly data use grew slightly from the 390GB recorded last quarter to 416GB in March. Broken down, it was 241GB for copper-based connections and 491GB for those on fibre. Chorus also reported that total fixed line connections continued to decline, reaching 1.356 million after losing 13,000 connections during the third quarter. Similarly, copper broadband and voice connections dropped by 42,000.

    On Wednesday, New Zealand’s competition watchdog, the Commerce Commission (ComCom), released the 2021 summer edition of its Measuring Broadband New Zealand report [PDF], which showed the performance of Fibre Max plans had improved substantially. The report shows the average download speed of Fibre Max plans has increased by more than 200Mbps, or 35%, to around 840Mbps. It attributed the improvement to network changes that were made by local wholesalers and retailers towards the end of last year. When the changes to network configurations were made, Fibre Max speeds reached around 940Mbps based on tests carried out by SamKnows.

    “Overall performance is now in line with advertised speeds. There is no noticeable dip in performance during peak hours and performance differences across the country have been smoothed,” said Telecommunications Commissioner Tristan Gilbertson. Fixed wireless technology, however, continues to experience “significant” latency, the report showed. “While 4G fixed wireless can offer higher download speeds than copper broadband, it is not comparable to fibre in its performance in real-time applications. Higher latency can have a real impact on consumers who use the internet for video calling, online gaming and watching high-definition video,” Gilbertson said.

    Meanwhile, 2degrees has announced it has selected Ericsson to help build its 5G network at 700 sites, with the build set to begin in Auckland and Wellington. The partnership comes after New Zealand banned the use of 5G equipment from 2degrees’s longstanding partner Huawei. Off the back of the ban, Huawei New Zealand saw revenue dip nearly NZ$200 million year on year to NZ$111 million for the period ending 31 December 2020, as reported by New Zealand Reseller News. Huawei also fell into the red by NZ$713,424, down from a net profit of NZ$4.9 million in 2019.


    What's the most popular web browser in 2021?

    I literally wrote the first popular article about the web, and I've been keeping a close eye on web browsers ever since, back when our only choice was the WEB shell program. We've come a long way, but web browsers are still the primary way we reach the endless fields of data, stories, and video that make up the modern web. And, today, Google's Chrome is how most of us work and play on the web.

    It has long been hard to get solid data on which web browsers are really the most popular. True, many companies, such as NetMarketShare and StatCounter, claim to have good information, but their numbers are massaged estimates. The US federal government's Digital Analytics Program (DAP), however, gives us a running count of the last 90 days of visits to US government websites. That doesn't tell us much about global browser use, but it's the best information we have about American web browser users today.
    And the top web browser, according to the DAP's 6.67 billion visits over the past 90 days (drumroll, please), is Google Chrome with 48.3%. That's a smidgen down from last year, when Chrome had 49.3%. This drop didn't come from any sudden rise of an alternative browser; perish the thought. On the desktop, Chrome rules. But in the last 12 months we've seen an enormous shift from PCs to smartphones for web use. In 2019-2020, just over half of web browsing sessions, 50%, were on smartphones, against 46.9% on PCs, with the remaining 3.1% on tablets. In 2020-2021, 57.4% of web browsing sessions were on smartphones, with only 40.5% on laptops and desktops, while the tablet share shrank to 2.1%.

    As for smartphones, the top browser there remains Safari. Macs continue to hang on with 9.5% of the PC market, but, at 34.6% of devices, iPhones dominate both the smartphone and smartphone-browser markets. The only other browsers that matter on smartphones, besides Safari and Chrome, are Samsung's built-in Samsung Internet, with 2.6%, and the generic Android WebView. In the US, it's clear iPhones are quickly gaining market share: last year, only 29.5% of the smartphones in use were iPhones, against 23% for Android-based phones; now it's 34.6% to Android's 24.5%, making iPhones more popular than ever.

    Getting back to web browsers, Chrome is even bigger than it looks at first glance. Its open-source foundation, Chromium, is also what Microsoft Edge runs on. Edge, with 5% of the user base, is now the third-place web browser. Except for Mozilla Firefox, all the other web browsers that matter, such as Opera, Vivaldi, and Brave, run on top of Chromium.

    Firefox is in fourth place and doing, in a word, badly. In the last 12 months, Firefox dropped to 2.7% from last year's 3.6%. In 2015, when I first started using DAP's numbers, Firefox had an 11% market share. By 2016, it had declined to 8.2%, with a slight bounce back up to 9% by 2018. Despite its ad deals with Google, Mozilla has been laying off more employees. You really must wonder how long Firefox is going to matter at all.

    At the bottom of the list comes the long-dying Internet Explorer (IE) with 2.2%. Even though Microsoft has urged users to dump IE in favor of Edge for over a year, some users are still sticking with this hopelessly out-of-date browser. The most popular version is the still-supported IE 11, with 1.9%. The antique IE 7, which hasn't been supported in over four years, is still hanging around with 0.2% of the market.

    In short, today's internet belongs to Chrome on the desktop and Safari on smartphones. Nothing else really matters.
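To put those percentages in perspective, here is a rough back-of-the-envelope conversion of the DAP shares quoted above into approximate visit counts out of the 6.67 billion total. Only browsers with an explicit share in the text are included, and the counts are my own illustrative arithmetic, not DAP output:

```python
# Convert the reported DAP share percentages into approximate visit
# counts, given 6.67 billion total visits over 90 days.
TOTAL_VISITS = 6.67e9

shares_pct = {
    "Chrome": 48.3,
    "Edge": 5.0,
    "Firefox": 2.7,
    "Internet Explorer": 2.2,
}

for browser, pct in shares_pct.items():
    visits = TOTAL_VISITS * pct / 100
    print(f"{browser:>17}: ~{visits / 1e9:.2f} billion visits")
```

Even at a "mere" 2.2%, IE still accounts for well over a hundred million visits, which goes some way to explaining why Microsoft keeps having to nudge holdouts towards Edge.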