More stories

  • Brazil announces partnership with Elon Musk to connect Amazon rainforest

    Written by Angelica Mari, Contributing Editor

    Angelica Mari is a Brazil-based technology journalist. She started working at age 15 as a computer instructor and began writing professionally about technology two years later.

    The Brazilian government has announced a partnership with Elon Musk's company Starlink for the operation of satellites over the Amazon rainforest. The plan was announced on 20 May, during Musk's visit to Brazil, where he met President Jair Bolsonaro, five ministers, and 10 local businessmen at a luxury hotel in the interior of São Paulo state.

    At the end of the event, Bolsonaro and Communications Minister Fabio Faria spoke briefly to the press, describing how Starlink satellites, which should go live over the Amazon in the coming months, could help provide broadband to schools and monitor fires and logging in the rainforest. Musk said on Twitter that Starlink would connect 19,000 schools in rural areas. In a press release, the Ministry of Communications repeated the contents of the tweet, adding that technical and investment details of the partnership will be "discussed at a later date, with the public and private sector stakeholders involved." No contracts were signed at the event.

    The partnership follows the February announcement from telecommunications agency Anatel that it had granted Starlink, a company owned by spaceflight firm SpaceX, the right to operate in Brazil, with exploitation rights running until 2027. The agency considered granting the rights until 2033 but shortened the authorization given the venture's "pioneering nature" and "possible unforeseen impacts."

    At the time of the Anatel announcement, the company said it planned to put 4,408 satellites into orbit as part of its plans to build an interconnected satellite-based internet network. The company will not have the right to protection and cannot cause service interference with other satellite systems. Prior to the latest developments, Musk, SpaceX president Gwynne Shotwell and minister Faria had met in November 2021, during a technical mission to the US, to discuss projects around connectivity and the use of next-generation technology in the Amazon.

    Environmental issues

    Bolsonaro has been criticized for his administration's handling of environmental matters in Brazil. Deforestation in the Brazilian Amazon rose 64% between January and March 2022 compared with the same period a year earlier, according to the national space agency Inpe. The head of state stressed that Musk's visit to Brazil is a "milestone" for the country and that technology will show the "truth" about how the Amazon is preserved. "Of course, there are niches for fires and irregular deforestation. But the arrival of satellites will help us preserve [the rainforest]," he said. "Now, we also need to develop that region, which is very rich in biodiversity and mineral wealth," the president added.

    Bolsonaro offered the billionaire the opportunity to exploit niobium reserves in Brazil, and noted that studies are underway to add graphene to it to create a super battery. However, Brazil's lithium reserve, the world's seventh-largest, is more interesting to Musk, since his company Tesla is shifting to lithium iron phosphate batteries for its electric vehicles. Elected in 2018 with the help of social networks, Bolsonaro described Musk as a freedom crusader and said the businessman's intention, announced in April, to buy Twitter for $44 billion was a "breath of hope".


  • NASA is investigating this 'mystery' data coming from Voyager 1

    NASA scientists are trying to make sense of buggy system data that the interstellar Voyager 1 spacecraft is transmitting from about 20 light-hours away, some 45 years after it launched. Voyager 1 launched in September 1977 and is now the farthest spacecraft from Earth, currently about 14.5 billion miles (23.3 billion kilometers) away. Light takes roughly 20 hours to travel from the spacecraft to Earth.
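    For a sense of scale, the quoted light-travel time follows directly from the distance. Here is a back-of-the-envelope check in Python, using the article's approximate figures rather than NASA's precise tracking data:

    ```python
    # Back-of-the-envelope check of the light travel time quoted above,
    # using the article's approximate distance figure (not NASA data).
    SPEED_OF_LIGHT_MPS = 186_282      # speed of light, miles per second (approx.)
    distance_miles = 14.5e9           # ~14.5 billion miles, per the article

    hours = distance_miles / SPEED_OF_LIGHT_MPS / 3600
    print(f"One-way light travel time: {hours:.1f} hours")  # about 21.6 hours with these round numbers
    ```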

    NASA's Jet Propulsion Laboratory is investigating a glitch in the system data the interstellar explorer is collecting. Readouts from the probe's attitude articulation and control system (AACS) "don't reflect what's actually happening onboard", according to JPL.

    Everything about the AACS suggests it is functioning normally, yet the telemetry data it's sending back to Earth is "invalid", producing what appears to be randomly generated data that doesn't match any possible state the system could be in. The AACS controls Voyager 1's orientation and keeps its high-gain antenna trained on Earth for optimal data transmission. Had the spacecraft's onboard fault-protection systems been triggered, it would have been put into a functionally reduced 'safe mode'. Its signal remains strong but the data appears to be malformed, according to NASA JPL.

    NASA considers anomalies like this to be normal for a spacecraft of its age. "A mystery like this is sort of par for the course at this stage of the Voyager mission," Suzanne Dodd, project manager for Voyager 1 and 2 at NASA's Jet Propulsion Laboratory in Southern California, said in a statement. "The spacecraft are both almost 45 years old, which is far beyond what the mission planners anticipated. We're also in interstellar space – a high-radiation environment that no spacecraft have flown in before. So there are some big challenges for the engineering team. But I think if there's a way to solve this issue with the AACS, our team will find it."

    Dodd said the team could simply "adapt" to the glitch if they can't identify its source. If the source is found, it could be fixed by a software update or by switching to one of the spacecraft's redundant hardware systems.

    Voyager 1 was launched from Cape Canaveral after Voyager 2 took off, but because of its faster route, it overtook its twin to fly by Jupiter in 1979 and then Saturn in 1980, according to NASA. It has also gone farther than Voyager 2, which is currently traveling about 12.1 billion miles from Earth.

    Voyager 1 was the first human-made object to reach interstellar space and, in 1998, overtook NASA's Pioneer 10 to become the most distant human-made object. It entered interstellar space in August 2012 and, among other things, takes measurements of the density of material in interstellar space. It will eventually exit the solar system, but not for a long, long time. "If we define our solar system as the Sun and everything that primarily orbits the Sun, Voyager 1 will remain within the confines of the solar system until it emerges from the Oort cloud in another 14,000 to 28,000 years," NASA notes.

    Both Voyagers carry a message on a gold-plated copper disc in case extraterrestrials find the spacecraft one day. The package also includes a player and instructions describing how to play the content. The disc includes greetings in 55 languages and 90 minutes of mostly Western music.


  • How to use the Opera VPN (and why you should)

    Written by Jack Wallen, Contributing Writer

    Jack Wallen is what happens when a Gen Xer mind-melds with present-day snark. Jack is a seeker of truth and a writer of words with a quantum mechanical pencil and a disjointed beat of sound and soul.


    May 17, 2022

    | Topic: VPN

    Once upon a time, VPNs were pieces of technology that made it possible for you to work remotely and still have access to internal files and directories (as if you were local). VPNs of today serve a much different purpose: what modern VPNs do is mask your IP address and encrypt your data.

    This is absolutely crucial for some users and use cases. Say you're working on a public wireless network, you have to transmit sensitive data, and you're not exactly certain how secure that network is. What do you do? Do you just go ahead and risk transmitting that data as you normally would?

    Not if security and privacy are important. If that's the case, a VPN will be your best friend.

    Why you should be using a VPN

    As I said earlier, a VPN not only masks your location but also encrypts the data you send from your browser. That's an important distinction, as the Opera VPN only works within the browser. This isn't a global VPN that masks and encrypts all data leaving a computer or mobile device; for that, you would have to use another service. But given that the majority of users do the majority of their work within a browser, a built-in VPN is a great option.

    But why should you care about masking your IP address or location? This is simple — privacy. If someone intercepts unencrypted, non-anonymized data from your computer or mobile device, they could locate you. When you use a VPN, your location can be masked to look like it's in a completely different country. Couple that with the data encryption and the big question should be, "Why have you put off using a VPN for this long?"

    With that said, I want to show you how to use the Opera VPN on both the mobile and desktop versions. I'll be demonstrating this on the Android and Linux versions of the browser, but the process should be similar, regardless of what platform you use.

    Using the VPN on Opera mobile

    Let's first take a look at how to enable the VPN on Opera mobile. To do this, open Opera on your device. From the Opera main window (Figure 1), tap the profile icon at the bottom right of the display.

    Figure 1: The Opera mobile main window as seen on Android 12.

    In the resulting popup, tap the gear icon in the upper left corner. You should then see the listing for the VPN (Figure 2).

    Figure 2: The VPN is currently disabled.

    Tap the ON/OFF slider until it's in the ON position. And now, everything you transmit from within the Opera browser is anonymized and encrypted.

    Using the VPN on Opera desktop

    To enable the VPN on Opera desktop, click the Opera icon in the top left corner and then click Settings. In the left navigation, click Privacy & security, where you'll see the entry for Enable VPN (Figure 3).

    Figure 3: Enabling the VPN on Opera desktop running on Pop!_OS Linux.

    Click the ON/OFF slider until it's in the ON position, which will place a small VPN icon to the left of the address bar (Figure 4).

    Figure 4: With the VPN icon showing, you know the VPN is on.

    Testing the VPN connection

    There's a simple way to test whether the VPN connection is working. First, turn off the VPN and go to whatismyipaddress.com. The results should show not only your current IP address but also its location. Next, turn on the VPN and go back to the same site. You should see both a different IP address and a different location. With the Opera VPN off, my connection was listed correctly. With the VPN on, my connection was listed in Colima, Mexico. Success!

    And that's all there is to using the Opera VPN on both the mobile and desktop versions. If you value your security and privacy, you should seriously consider making use of this feature.
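    If you prefer to script the before-and-after IP check for a system-wide VPN, a minimal sketch using the public api.ipify.org echo service might look like the following. Note the assumption: Opera's VPN tunnels browser traffic only, so a standalone script like this won't reflect it; for Opera itself, the in-browser whatismyipaddress.com check described above is the right test.

    ```python
    # Minimal sketch of the before/after IP check, using the public api.ipify.org
    # service. Opera's VPN only covers browser traffic, so this is meaningful for
    # a system-wide VPN rather than the in-browser Opera VPN.
    import requests

    def current_public_ip() -> str:
        """Return the public IP address the remote service sees for this machine."""
        return requests.get("https://api.ipify.org", timeout=10).text.strip()

    if __name__ == "__main__":
        print("Public IP right now:", current_public_ip())
        # Run once with the VPN off and once with it on, then compare the output.
    ```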

  • AMD, Qualcomm to offer Wi-Fi 6 and 6E, and secure Wi-Fi remote management

    Written by Adrian Kingsley-Hughes, Contributor

    Adrian Kingsley-Hughes is an internationally published technology author who has devoted over two decades to helping users get the most from technology — whether by helping them learn to program, build a PC from a pile of parts, or get the most from a new MP3 player or digital camera. Adrian has authored and co-authored technical books on a variety of topics, ranging from programming to building and maintaining PCs.

    AMD and Qualcomm have been collaborating to optimize Qualcomm's FastConnect 6900 wireless connectivity for AMD's Ryzen PRO line of processors aimed at business laptops. By using the 6GHz wireless band and multiple Wi-Fi bands, FastConnect can improve video conferencing, reduce latency, and enhance connection reliability.

    But FastConnect offers more. IT administrators can now leverage the AMD Manageability Processor and make use of FastConnect's support for almost three dozen of the most widely used open-standard-based DASH profiles to carry out remote management on AMD commercial platforms. This is a ready-to-use, built-in solution for enterprise customers at a time when hybrid working is a big part of what IT admins have to deal with.

    "Out-of-band Wi-Fi remote management is an important tool for enterprise IT managers to diagnose and fix issues, even when the operating system is not running," said Jason Banta, CVP and General Manager, OEM Client Computing, AMD. "AMD Ryzen PRO 6000 Series processors with Qualcomm FastConnect 6900 enable next-generation business laptops to have the processing and connectivity tools needed to perform in modern environments, offering professional-strength remote manageability for users in the new, hybrid workplace."

    The first chips to offer FastConnect will be the AMD Ryzen PRO 6000 Series processors, which will be found in systems such as the Lenovo ThinkPad Z Series and HP EliteBook 805 Series. Along with FastConnect, these chips bring the power, performance, and battery life that business laptop users need.

    "Our collaboration with AMD reflects Qualcomm Technologies' commitment to the mobile computing space. By optimizing FastConnect 6900 for platforms powered by AMD Ryzen 6000 Series processors, we're bringing secure Wi-Fi remote management to AMD enterprise customers," said Dino Bekis, vice president and general manager, Mobile Compute and Connectivity, Qualcomm Technologies, Inc. "This represents the first step in our relationship to bring superior wireless connectivity to the AMD mobile computing roadmap."


  • Ethernet creator Metcalfe: Web3 will have all kinds of 'network effects'

    Written by Tiernan Ray, Contributing Writer

    Tiernan Ray has been covering technology and business for 27 years. He was most recently technology editor for Barron's, where he wrote daily market coverage for the Tech Trader blog and the weekly print column of the same name.

    “For the first time, I am trying to say exactly what kinds of value are created by networks,” Bob Metcalfe, inventor of Ethernet, told a small group during a soiree on the sidelines of The Knowledge Graph conference. He predicts decentralized knowledge graphs, which marry knowledge graph databases with connectivity, will create new forms of value.
    When Bob Metcalfe was selling Ethernet to the world as a new networking technology in the 1980s at 3Com Corp., he had a clever sales pitch: you'll get more value out of the product the more of it you buy.

    What sounded like a cheeky pitch hid a deeper element of truth: networks are more valuable the more things they connect. Later, Metcalfe refined what he was talking about, formulating what came to be called "Metcalfe's Law." The law says that the value of a network increases as the square of the number of entities taking part in it, where entities could be computers but also humans, as in the case of Facebook. The value grows as the square because the number of possible connections between participants grows roughly as the square of their number.
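    A quick sketch makes that scaling concrete: among n participants there are n(n-1)/2 possible pairwise links, which grows on the order of n². The figures below are just arithmetic for illustration, not Metcalfe's own numbers:

    ```python
    # Illustration of the scaling behind Metcalfe's Law: the number of possible
    # pairwise connections among n participants grows roughly as n squared.
    def pairwise_connections(n: int) -> int:
        return n * (n - 1) // 2  # each pair of participants can form one link

    for n in (10, 100, 1_000, 10_000):
        print(f"{n:>6} participants -> {pairwise_connections(n):>12,} possible connections")
    ```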

    Things that get better in this way, said Metcalfe, have what he has christened "network effects," a kind of centripetal force where more and more participants induce even more participation, in a virtuous cycle. Facebook shows that: the more people join, the more other people are inclined to join.

    Metcalfe is still refining his pitch for his Law and learning at the same time. "There are going to be all kinds of network effects in Web3," said Metcalfe, during an informal gathering in Williamsburg, Brooklyn, on the sidelines of The Knowledge Graph Conference, where enthusiasts of knowledge graphs share technology, techniques, and best practices. "For the first time, I am trying to say exactly what kinds of value are created by networks," Metcalfe told ZDNet at the Williamsburg event. "What I have learned today is that knowledge graphs can go a lot farther if they are decentralized," said Metcalfe. "The key is the connectivity."

    Earlier in the day, Metcalfe had given a talk on the KGC main stage, "Network Effects in Web3." In the talk, Metcalfe explained that "networks are valuable" in many ways. They offer value in "collecting data," said Metcalfe — the ability to get data from many participants. There is also sharing value: sharing disk drives, say, or sharing files. Netflix, said Metcalfe, has "distribution value — they distribute content and it's valuable."

    There will be new forms of value creation, Metcalfe believes, based on startups that combine knowledge graphs with connectivity. The event in Williamsburg was hosted by one such startup, OriginTrail, which was founded in 2013, is officially headquartered in Ljubljana, Slovenia, and has offices in Gibraltar and the US. Metcalfe is an advisor to OriginTrail.

    Metcalfe, left, at the KGC conference Tuesday with OriginTrail general manager Juri Skornik, center, and CTO Branimir Rakic.
    OriginTrail is creating what it calls the first "decentralized" knowledge graph — a knowledge graph whose nodes can be networked. The basic idea is that while "Layer 1" blockchain technologies authenticate items, the "Layer 2" technology of OriginTrail's Distributed Knowledge Graph lets you query and interact with the things that have been authenticated.

    Everything that is unique has a "Universal Asset Locator," or UAL, an analog to Web URLs. The UALs are meant to be compliant with the W3C's spec for "decentralized identifiers." The form is just like an HTTP address, preceded by the identifier tag "dkg://", for distributed knowledge graph, with the address of the particular item following.

    Transactions happen as people "publish" things on the Internet with a unique UAL — through a simple "create" statement — that is then recorded by the decentralized knowledge graph of nodes, currently a couple of thousand. Everything that is published is a unique asset, a digital twin, so it can stand for real-world objects, such as sneakers or whiskey. It can be sold to another party, who "takes control of the state of that graph," as Rakic explains, by giving that person the NFT that holds the UAL.

    The nodes each have graph databases holding pieces of the collective graph, and they each function in a permissionless, peer-to-peer fashion that is analogous to how blockchains function. Similar to blockchains, those who run nodes to verify published things are rewarded by the people who publish them. OriginTrail's knowledge graph relies on multiple Layer 1 blockchains, but the company is soon going to introduce its own blockchain, running as a function of the Polkadot blockchain.

    As co-founder and CTO Branimir Rakic explained Wednesday during a technical presentation, "blockchains are not good databases." Blockchains can be queried, but only in a limited fashion, said Rakic. What's needed, he maintains, is a "semantic network" on top of blockchains. That's what the company proposes with its distributed knowledge graph. By combining Tim Berners-Lee's notion of "The Semantic Web" with Web3, said Rakic, you get "The Semantic Web3."

    "I like where it's going," said Metcalfe of OriginTrail's approach. "All this stuff — DeFi, DAOs, crypto — all the decentralized stuff of Web3, it's all going in this direction of sharing value."

    Metcalfe told the group at the Williamsburg soiree that decentralized knowledge graphs will make possible a kind of eternal springtime for artificial intelligence. "AI was invented in about 1968, when I was a graduate student," he said. "And for years, AI would rise and then it would fall, and it fell because AI ran out of data," explained Metcalfe. "AI relies on data. Well, it's not going to fall, it's going to continue to rise, because the decentralized knowledge graphs are going to give AI more and more data."

    Metcalfe, who for a decade served as a judge of the startup competition at South by Southwest, was asked by ZDNet how he rates OriginTrail's chances of success as a company. "The weakness of it is that it's too complicated to explain" to ordinary mortals, said Metcalfe of the technology. The OriginTrail technology appears a bit like middleware, a category that tends to excite only a handful of people.

    "Yes, and I'm one of them," said Metcalfe. Despite the complexity of the tech, "What they are doing is right in line with where things are going." More importantly, he took on the advisor role because he's learning from what the company is doing, educating himself on what new forms of value there will be.

    The Knowledge Graph Conference is in its fourth year, having begun life as a small affair in a ballroom at Columbia University in 2019. This year, after two years of virtual-only proceedings, the conference has blossomed into a sprawling hybrid event, with dozens of panels as well as live sessions at the Cornell Tech campus on Roosevelt Island in New York City. The program runs through May 6th.
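    To make the address format concrete, here is a purely illustrative sketch of an HTTP-like locator behind a "dkg://" scheme, as the article describes. The example identifier below is hypothetical and is not taken from OriginTrail's actual specification:

    ```python
    # Illustrative parsing of a "dkg://" style asset locator (hypothetical example,
    # not OriginTrail's real UAL format).
    from urllib.parse import urlparse

    def parse_ual(ual: str) -> dict:
        parsed = urlparse(ual)
        if parsed.scheme != "dkg":
            raise ValueError("not a dkg:// asset locator")
        # netloc holds the part right after "dkg://"; path holds the rest of the address
        return {"scheme": parsed.scheme, "authority": parsed.netloc, "path": parsed.path}

    print(parse_ual("dkg://example-network/asset/1234"))  # hypothetical UAL
    ```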


  • Inexpensive Wi-Fi 6: Motorola MH7603 mesh router for the win

    A friend of mine recently told me that while he appreciated that I could actually use the incredible speed of the Netgear Orbi Wi-Fi 6E, he could never justify buying it. I get that. The top-end Orbi (see my review) is for people who must have the fastest possible Wi-Fi. For everyone else, there's the much more affordable Motorola MH7603 mesh router.

    The three-unit MH7603 mesh router uses Wi-Fi 6, aka 802.11ax, to deliver 692 Mbps speeds in the same room. At a range of 10 yards and through a wall, it delivered an honest 287 Mbps. Jaw-dropping? No. Pretty darn good? You bet.

    Like
      • Price
      • Speed
      • Range

    Don't Like
      • No WPA3 security
      • Not enough administration control for small business use

    To test it, I used Ixia's IxChariot networking benchmark and my Galaxy S21 Ultra smartphone, backed by my 1 Gigabit Charter cable internet connection. Now, in theory, this dual-band 2.4 and 5GHz AX1800 system can reach speeds of up to 574Mbps on 2.4GHz and up to 1,200Mbps on 5GHz. In practice, no one ever reaches those speeds on any Wi-Fi hardware. It supports most Wi-Fi 6 technologies, hence its speed. However, it doesn't have WPA3 encryption or 160MHz channel support. For home users, that's not a big problem.

    The mesh network also has good range and penetration. That's a must for me. I have both a historic home, with 3,000 square feet and the thick walls that come with an early-1900s house, and a modern 1,000-square-foot office. The MH7603 can cover up to 5,000 square feet: the main router covers 2,000 square feet, while each mesh node handles 1,500 square feet. It took some positioning, but the Motorola unit was able to cover both buildings when I was done. If you don't need that kind of coverage, you can buy a single router node for $129.99.

    Under the hood, there's a 1.5GHz quad-core ARM CPU, 256MB of DDR3 RAM, and 128MB of flash memory. Each unit also has a pair of internal antennas.

    Now, the MH7603 isn't going to win any design awards from Jony Ive. The units are three identical white boxes, standing 2.6 inches tall and 5 inches wide, with a Motorola "M" logo on top. A single small LED on the front indicates what's going on in the box. When all's well, it shows a solid white light. When there's a poor connection, it shows amber. A slowly blinking blue indicator means it's in setup mode; rapid blue blinks mean the unit is upgrading its firmware. But while it may not be pretty, it works well, and when it comes to Wi-Fi units, that's all I want.

    The units come with two gigabit Ethernet ports. You can use both as LAN ports, or you can use one for gigabit Ethernet backhaul. Personally, I always use cable for my backhaul whenever possible. Wi-Fi is getting faster, but you still can't beat cable for sheer speed and low latency.

    Unlike higher-end mesh Wi-Fi gear, the MH7603 doesn't have either a web management user interface or a command-line interface. Instead, you must use the Android or iOS motosync mobile app. It's a very simple app. It starts with a Network screen that shows icons for each node and their connected devices. Tapping the icons lets you see which devices are connected, their signal strength, and their bandwidth usage. You can also reboot units and run a speed test. It's as simple a network interface as you'll ever see.

    As you scroll down, you'll also find panels for Security, Full Home Filter, Connection, and Top Data Use. Again, they're all very simple. The Security panel tells you if your network is secure, while the Full Home Filter panel blocks adult and malicious websites for all or some users. It also comes with ad blocking. Now, for me, a former NASA network administrator, that's nothing like enough control. But this mesh network isn't for me or anyone running even a basic business network. It's for someone who needs a good, reliable home Wi-Fi network, and for those people, it does just fine.

    Setting it up is also mindlessly easy. You plug the units in, create an account, and click "Set Up a New Device" on the Get Started screen. That's pretty much it. You just follow the instructions. The most "technical" thing you'll need to do is scan the QR code on each node's base.

    Conclusions

    The best news? This easy-to-use, solid, fast Wi-Fi mesh will cost you $238.97. You aren't going to find its equal for less.


  • How XDR provides protection against advanced exploits

    Damage caused by advanced exploits, such as Log4Shell and Spring4Shell, has been widely documented. These came out of nowhere and seemingly crippled many organizations, despite record cybersecurity industry budgets that will clear $146B in 2022. This post from Palo Alto Networks highlights that, based on telemetry, the company observed more than 125 million hits with the associated packet capture that triggered the signature. It certainly raises the question of why breaches are becoming more common and more damaging while security spending is at an all-time high.

    The answer lies in the approach many businesses have taken to threat protection. Traditional security is based on perceived best-of-breed products being used for specific functions. For example, firewalls protect the network, EDR protects endpoints, CASB protects the cloud, and so on. Most of these tools do a great job within their domains, but exploits are not limited to one specific domain, so the silo-like nature of security creates many blind spots.

    Point products can't see the end-to-end threat landscape

    For example, EDR tools are meant to find threats on endpoints, and they are effective at that specific task, but they have no visibility outside the endpoint. So if the breach occurred elsewhere, there is no way of knowing where and when. This is why so many EDR tools are excellent at detection but poor at response. The same can be said of firewalls, which generally know everything that's happening on a network but have no insight into an endpoint or many cloud services.

    Solving this problem lies in embracing the concept of XDR. Definitionally, I want to be clear that the X in XDR means "all" versus "eXtended," the latter of which has been pushed by many of the point-product vendors. Security pros need to understand that an upgraded EDR or SIEM tool is not XDR; it is merely a legacy tool with a little more visibility.

    XDR is the way forward for security

    True XDR is about taking data from across the end-to-end infrastructure and correlating the information to find exploits and threats. This allows an exploit to be quickly identified and tracked across the infrastructure so all infected devices can be identified. While it's impractical to assume that an organization would purchase all its infrastructure from a single vendor, I do believe that organizations should look to consolidate a minimum of network, endpoint and cloud security with a single vendor and treat that as the foundational platform for XDR. This would ensure that the vendor interoperates with other security providers to ingest the necessary data.

    Another benefit of XDR is that it provides a single source of truth across all security functions, which is vastly different from traditional security, where the security team has multiple tools, each with its own set of data and insights. The only way to correlate that information would be to do it manually, which is impossible today, given the massive amount of security data being collected. People can't work fast enough, but an XDR solution, powered by artificial intelligence, can provide insights to a range of security analysts.

    XDR meets the needs of different security roles

    A good visualization of the value of XDR is depicted on Palo Alto Networks' Log4j Incident Response Simulation page. It features three different SOC roles and how XDR can aid their jobs. Specifically, the site does a deep dive on the following functions:

      • Guy, the Threat Hunter: His job is to hunt for sophisticated attacks and those difficult-to-find low and slow threats that fly under the radar of traditional security tools, looking for unusual activities and other anomalies that are indicators of compromise. Cortex XDR makes threat hunting easier as it correlates data across endpoints, network, cloud and identity. Guy can then use an advanced XQL query language to aggregate, visualize and filter results that can quickly identify affected assets.

      • Peter, the Tier 2 SOC Analyst: His function is to monitor, prioritize and investigate alerts; his work is used to resolve incidents and remediate threats. The problem is that most SOC tools produce far too many false positives, making the information useless. This is why it's my belief that the traditional SIEM needs a major overhaul. XDR uses machine learning and behavioral analytics to uncover advanced zero-day threats. Many SIEMs claim to do this, but most are just basic rules-based engines that need continual updating. With XDR, the investigation of threats is accelerated by grouping related alerts into incidents, and the root cause is then revealed through cross-data insights.

      • Kasey, Director of Vulnerability Management: Her job is to discover and analyze application, system, network and other IT vulnerabilities, and then assess and prioritize risk. Once that analysis is done, patching and resolving vulnerabilities can be performed. This is difficult, if not impossible, to do with point products because there is no way to understand the impact of a threat across systems. XDR can be combined with other tools, such as attack-surface management (ASM), to find and mitigate software vulnerable to Log4j and other exploits across the organization.

    In summary, I'll go back to a conversation I had with a CISO a few months ago, who told me that he finally understood that best of breed everywhere does not lead to best-in-class threat protection. In fact, the average of 30+ security vendors that businesses use today creates a management mess and leads to suboptimal protection. The path forward must be XDR, because it's the only way to correlate historically siloed data to find threats and quickly remediate them before they cripple the business.
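    To illustrate the cross-domain correlation idea described above, here is a simplified, hypothetical sketch: alerts from endpoint, network and cloud tools are grouped into incidents when they share an indicator. The alert records and field names are invented for illustration and are not Cortex XDR's actual data model or logic.

    ```python
    # Hypothetical sketch of XDR-style correlation: group alerts from different
    # security domains into one incident when they share an indicator (here, a
    # file hash). All records and field names below are invented.
    from collections import defaultdict

    alerts = [
        {"source": "endpoint", "host": "laptop-42",  "indicator": "hash:abc123"},
        {"source": "network",  "host": "laptop-42",  "indicator": "ip:203.0.113.7"},
        {"source": "cloud",    "host": "vm-web-01",  "indicator": "hash:abc123"},
    ]

    # Naive grouping: every alert carrying the same indicator lands in one incident.
    incidents = defaultdict(list)
    for alert in alerts:
        incidents[alert["indicator"]].append(alert)

    for indicator, related in incidents.items():
        sources = sorted({a["source"] for a in related})
        print(f"Incident around {indicator}: {len(related)} alert(s) from {sources}")
    ```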

    A good resource for security professionals, particularly Palo Alto Networks customers, is the upcoming Palo Alto Networks Symphony 2022, on May 18 and 19. While this is a vendor event, it's filled with information on how to revamp security operations to keep them in line with current trends.

  • Cortex App to launch new Web3 content network this summer

    May 2, 2022

    | Topic: Web3

    In its quest to make "Web3 available to everyone," the Core team behind the newly formed Cortex App said Monday that it's introducing a new Web3 content network for launch in June. The network will "bring Web2 functionality, such as social posts and blogs, into a decentralized and user-owned Web3." This is the same group that launched free ".hmn" domain names on the Polygon protocol earlier this year.

    The Cortex Network, according to Monday's release, will afford users new levels of control and privacy over themselves and their content. What's more, the network will enable new ways to collaborate and define payment models for NFTs and content. "In the Cortex Network, each page (URL) will be a wallet address where a user could receive tokens for their content, or send out tokens as well," Leonard Kish, co-founder of Cortex App, told ZDNet. "Each page will essentially be a store for data and a store for NFTs or other tokens," he said.

    How it works

    The Cortex Network will act like a proof-of-stake blockchain, whereby publishers stake so-called CRTX tokens to validate user updates and then publish them over the Polygon network. A new kind of index, known as HDIndex, will create a hash that acts both as an on-chain proof and as a lookup to content updates. The press release claims that "when publishing on the Cortex Network, users will own their content as they control updates with their keys."

    The goal for the Cortex Network is to simplify Web3 publishing, making it easier for current Web2 publishers to migrate to a user-owned Web3 content network.
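    As a conceptual illustration of that hash-as-proof-and-lookup idea (not Cortex's actual HDIndex; the page names below are hypothetical), a sketch might look like this:

    ```python
    # Conceptual sketch only: hash each content update so the digest can serve
    # both as a proof of what was published and as a lookup key for the page.
    # This is not Cortex's HDIndex; the page names are hypothetical.
    import hashlib

    index: dict[str, list[str]] = {}  # page -> ordered list of update hashes

    def publish_update(page: str, content: bytes) -> str:
        digest = hashlib.sha256(content).hexdigest()  # fingerprint of this update
        index.setdefault(page, []).append(digest)     # lookup entry for the page
        return digest                                  # this is what would be anchored on-chain

    proof = publish_update("alice.hmn/blog", b"first post")
    print(proof, index["alice.hmn/blog"])
    ```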


    The Network is based on an architecture in which batches of updates (known as "commits") contribute to a local state of content, which in turn becomes part of a globally verifiable, localized consensus. What all that means is that each commit contributes to a globally verifiable state for the content, with a complete history of the content at a particular web address. "In the Cortex system, URLs and crypto addresses are nearly synonymous as part of a human-readable namespace for keys that act as lookups to content," according to the press release.

    Kish notes that when it comes to new ways to collaborate and define payment models for NFTs, the NFT domains (such as "kish.hmn") and subdomains ("leo.kish.hmn") where the content lives are fully transferable, so an NFT can have a full story. And when transferred, that story (the content) can move with the NFT domain as well. "We are working on several ways to expand how NFTs work and the kinds of value they can transmit, and this is one," Kish said. "Others are coming as well."

    Barriers to break down

    Price, complexity, scalability and consistency are four obstacles to Web3 publishing, and the Cortex Content Network intends to overcome them. The Network will act as a "complete stack to enable not only a fast and reliable decentralized content environment for Web3, but scalable as well," according to the Cortex App blog.

    Further work is needed before the Network is ready for prime time. "We are working with partners now on testing elements of the network, but we don't have an exact date," Kish said. "We do expect to be able to provide an exact date for launch in the month of June."
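    To make the commit-and-state idea above concrete, here is a toy illustration (not Cortex's actual protocol): each commit hash covers a batch of updates plus the previous state, so replaying the full history reproduces the same verifiable final state.

    ```python
    # Toy illustration (not Cortex's protocol): chain commits so that the final
    # state hash is verifiable by replaying the complete history of updates
    # for one web address.
    import hashlib
    import json

    def apply_commit(prev_state_hash: str, updates: list[str]) -> str:
        payload = json.dumps({"prev": prev_state_hash, "updates": updates}, sort_keys=True)
        return hashlib.sha256(payload.encode()).hexdigest()

    state = "0" * 64  # genesis state for one web address
    history = [["create page"], ["edit paragraph 2"], ["add image caption"]]
    for batch in history:
        state = apply_commit(state, batch)

    print("Verifiable state after replaying all commits:", state)
    ```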