Enterprise Management Associates and Pulse Secure report that 60% of organizations have accelerated their zero trust projects during the pandemic, while only 15% have slowed down.
Zero trust is a network security model that minimizes risk by applying granular policies and controls to network access and network communications. Zero trust operates by constantly verifying the legitimacy of network communications, even inside the network perimeter. Changes in location, device state, security state, behavior, and more can trigger a re-authentication process.
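The continuous-verification idea described above can be illustrated with a small sketch. The signal names, threshold, and decision values here are hypothetical, invented for this example rather than taken from any particular zero trust product:

```python
# Hypothetical sketch of a zero trust policy check: every request is
# re-evaluated against current context signals, not just a one-time login.

from dataclasses import dataclass

@dataclass
class AccessContext:
    user_authenticated: bool
    device_compliant: bool   # e.g. disk encryption on, patches current
    location_changed: bool   # request arrives from a new location
    risk_score: float        # 0.0 (low) to 1.0 (high), from behavior analytics

def evaluate(ctx: AccessContext) -> str:
    """Return 'allow', 'step_up' (force re-authentication), or 'deny'."""
    if not ctx.user_authenticated or not ctx.device_compliant:
        return "deny"
    # A change in location or elevated risk triggers re-authentication
    # rather than silently trusting the existing session.
    if ctx.location_changed or ctx.risk_score > 0.5:
        return "step_up"
    return "allow"

print(evaluate(AccessContext(True, True, False, 0.1)))   # allow
print(evaluate(AccessContext(True, True, True, 0.1)))    # step_up
print(evaluate(AccessContext(True, False, False, 0.0)))  # deny
```

A real deployment would feed these signals from an identity provider and endpoint management agent, but the shape of the decision, deny or step up on any change rather than trusting the session, is the point.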
SEE: Identity theft protection policy (TechRepublic Premium)
Pulse Secure, a provider of zero trust secure access solutions, released a report last month stating the COVID-19 pandemic has not impacted the adoption of zero trust technology globally. In fact, nearly two-thirds of organizations (60%) said they have accelerated zero trust implementation during the pandemic.
Furthermore, enterprise responses regarding their success with zero trust were quite positive; the majority (94%) indicated degrees of success, and half labeled their efforts as successful.
However, the survey found that collaboration is not without complications. Eighty-five percent of respondents in zero trust task forces and partnerships found themselves struggling with cross-team skills gaps (33%), a lack of tools and processes that might facilitate collaboration (31%), and budget conflicts (31%).
I spoke with industry experts Mike Riemer, global chief technology officer of Ivanti, an IT asset and service management software provider, and Amit Bareket, co-founder and CEO of SaaS provider Perimeter 81, to learn more about zero trust.
Scott Matteson: What are some subjective examples of zero trust in action?
Mike Riemer: As businesses added capacity to support remote office accessibility, companies have had to cope with amplified security threats stemming from increased use of personal computing, home office and public networks, and cloud applications. Over the past year, the vast majority of enterprises saw an increase in incidents related to phishing and identity theft, susceptible and unmanaged endpoints, and insecure connections. Traditional corporate perimeter defenses are not going away but have morphed to cloud and edge computing.
Organizations are optimizing their investments to address the new normal of a hybrid, flexible workplace, with a focus on user experience, administration ease, end-to-end visibility and adaptive threat response. The sheer volume of users, devices, resource and application access, as well as the dynamics of user, resource and application provisioning, is driving investment in zero trust network access and longer-term planning. To mitigate ongoing unauthorized access, malware and data breach risks, organizations are accelerating the coordination of security controls that enable the zero trust tenets of user, device and security posture verification, and applying conditional access based on continuous risk assessment.
SEE: How to manage passwords: Best practices and security tips (free PDF) (TechRepublic)
Amit Bareket: One example of zero trust in action is in managing contractors or workers that are remote. Today, these types of employees pose an easy and simple target for hackers. They are using their own devices and often exposing them to public Wi-Fi. Zero trust can enforce extremely restricted access to these workers and fight off security holes. Instead of every worker having access to the entire corporate network and resources, zero trust enables businesses to limit access only to the resources specific employees need to do their daily jobs.
Enforcing a zero trust approach limits the attack surface as attackers won’t be able to exploit within the network and gain access to the more critical and sensitive resources. As contractors and remote workers are the lowest hanging fruit for hackers, zero trust ensures these users won’t have deep access for attackers to exploit. Additionally, zero trust can allow specific users such as CEOs and CISOs to have high-privilege access that will allow these users to have “your eyes only” access.
Scott Matteson: What are the requirements for a zero trust implementation (hardware, software, policies, etc.)?
Mike Riemer: Organizations must start by taking inventory of all user and application access conditions, resources, and data protection obligations. Once everything has been accounted for, the next step is determining the key business needs for direct, private application access, including whether the staff can appropriately address meeting user and application access capabilities and the associated security policies. It is also important to identify what applications and use cases are not supported or require workarounds, including those that are legacy or latency-sensitive. These steps will help organizations determine whether they can take on managing ZTNA software themselves or if they will instead be contracting with a SaaS-based solution.
The next phase involves reviewing key takeaways identified during the above assessment to determine how easy and economical it will be to purchase, deploy and manage the ZTNA solution in conjunction with other secure access mechanisms. For example, as an organization moves to cloud-delivered security, to what extent will its current hybrid IT infrastructure, services and locations be supported? Knowing this information is crucial to building out a successful program.
Amit Bareket: There is a common misunderstanding that zero trust is both costly and complicated to implement, but as more businesses are moving their infrastructures to the cloud, it has become easier to take advantage of the benefits of zero trust.
Zero trust is an approach and not a product, but this doesn't mean you don't need to have the right product and policies in place to enforce the implementation. From identity and access management, to cloud access security brokers, to SIEM solutions, each factor will help businesses achieve a more successful implementation.
Scott Matteson: What should IT administrators do to prepare themselves to implement, administer and maintain a zero-trust implementation?
SEE: Cybersecurity: Let’s get tactical (free PDF) (TechRepublic)
Mike Riemer: Organizations need to consider the extent to which their applications and services can and will be moved to the cloud. There are also investment and process details to consider. Many organizations have made a sizable investment in VPN and virtual desktop infrastructure (VDI) solutions based on the knowledge that the technology would work well with their hybrid IT infrastructure, would be convenient for users and administrators to manage, and would support their existing applications and security ecosystem. Furthermore, that investment decision is also aligned within their budget and depreciation expectations. As such, the majority of organizations will need to determine how to offset the investment. Ideally, the most pragmatic approach would be to seek ZTNA solutions that can co-exist with their other secure access investments, providing greater deployment flexibility as enterprises migrate applications to the private and public cloud and adopt edge-based services to address workplace flexibility and digital business requirements.
Amit Bareket: When implementing a zero trust security architecture, IT managers must isolate resources within their IT infrastructure in the form of micro-segmentation. Forrester Research recommends dividing network resources at a granular level, allowing organizations to tune security settings to different types of traffic and create policies that limit network and application flows to only those that are explicitly permitted. This network micro-segmentation approach allows security teams the flexibility to apply the right level of protection to a given workload based on sensitivity and value to the business.
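The micro-segmentation principle Bareket describes, permitting only network and application flows that are explicitly allowed, amounts to a default-deny allowlist. The segment names, port numbers, and rules below are illustrative only, not drawn from Forrester or any vendor:

```python
# Illustrative default-deny flow policy for network micro-segmentation.
# Segment names, ports, and rules are made up for this example.

ALLOWED_FLOWS = {
    ("web-tier", "app-tier", 8443),  # web servers may call the app API
    ("app-tier", "db-tier", 5432),   # app servers may reach the database
}

def is_flow_permitted(src_segment: str, dst_segment: str, port: int) -> bool:
    """Default deny: a flow passes only if it is explicitly allowlisted."""
    return (src_segment, dst_segment, port) in ALLOWED_FLOWS

print(is_flow_permitted("web-tier", "app-tier", 8443))  # True
print(is_flow_permitted("web-tier", "db-tier", 5432))   # False: no direct path
```

Because nothing is permitted unless listed, a compromised web server cannot reach the database directly; the attacker is confined to the one flow that segment was granted.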
Scott Matteson: What should end users be trained on in order to rely upon zero trust solutions?
Mike Riemer: A zero trust approach would ensure that employees' devices are secure and meet corporate security policies before any intellectual property is allowed onto, or to flow through, the device. Enhanced security policies seamlessly enforced on employees' devices, particularly while remote connectivity remains at an all-time high, will strengthen the entire enterprise security posture, as endpoints, even new ones introduced during remote work, are protected.
Amit Bareket: When training end users on new zero trust solutions, they should be introduced to its key guideline: "never trust, always verify." IT managers should train end users to understand the different features that are integrated within the solution. With zero trust networks, multi-factor authentication is used to verify identities and then manage access to data based on the user's "need to use." Additionally, end users should use distinct, complex passwords and adopt single sign-on features for a more secure end-user experience.
Scott Matteson: Are there any security or operational considerations involved?
Amit Bareket: Zero trust offers more than just another layer of security against hackers. Zero trust delivers considerable business benefits such as greater network visibility, reduced IT complexity, less demanding security workloads, data protection, a superior user experience, and support for cloud migration. These benefits come with different operational considerations where employees’ access will need to be redesigned to fit an implemented zero trust model on a network. DevOps, IT, and Security teams will be heavily involved to ensure that across the business the newly implemented zero trust model is being adopted correctly.
Scott Matteson: How is this trend expected to evolve down the road?
Mike Riemer: In July 2020, bad actors leveraged social engineering techniques, which involves manipulating people into giving up sensitive information, in order to pose as internal IT staff and convince Twitter employees working from home to enter their login information. The phishing attack resulted in numerous high-profile Twitter accounts, like Barack Obama and Elon Musk, being hacked. Twitter was ultimately found to have insufficient internal controls and a lack of cybersecurity regulation, which contributed to the incident.
The brazen nature of the Twitter attack shows bad actors are using social engineering to raise the stakes, and we can expect to see more of these high-profile orchestrated events in 2021 as remote work continues and cyber criminals look for new, creative ways to infiltrate organizations. The incident represents a new focus on remote users and remote connectivity, whether through VPN tunnels or other remote connectivity forms. In response, companies must prepare now with the appropriate end-user education and adopt an adaptive risk and trust threat assessment mentality. This can be accomplished by adopting a zero trust approach founded on the principles of continuous verification and authorizations that allow organizations to have better visibility and insight into what is, and is not, typical behavior for an employee.
Amit Bareket: The current shift to remote work has increased the need to adopt zero trust, but the truth is this trend has long been in the works. The future of security will push for companies to implement more cloud-based security solutions with user identification. This will increase the usage of zero trust, providing businesses a more flexible and scalable option than traditional network security solutions.
Instead of forcing employees to connect through a VPN or a firewall, businesses will invest more heavily in user-centric zero trust solutions that provide flexibility and more modern cloud-based security.
AMD Radeon RX 6800 XT and Radeon RX 6800 Review
AMD has for several years now lagged behind Nvidia in the graphics performance race, particularly at the high end where ray tracing has become the new must-have feature to show off, and buyers always want the latest and greatest. The company’s problems are compounded in India, where unfavourable pricing and limited distribution often tip the scales in Nvidia’s favour even when performance is comparable. Just like it did with its Ryzen desktop CPUs a few years ago, AMD has been working on turning its fortunes around with new Radeon GPUs built around the RDNA architecture and its successors.
With the new Radeon RX 6000 series based on the RDNA 2 architecture, AMD thinks it can take Nvidia on even at the very top end. Three GPUs have been launched so far: the flagship Radeon RX 6900 XT which hopes to take on the GeForce RTX 3090; and the Radeon RX 6800 XT and Radeon RX 6800, which are for gamers who want great quality at 4K but don’t have infinitely stretchable budgets. This is the most confident that AMD has been in a very long time, and yes, there’s even ray tracing – or at least an early implementation of it.
Today, we’re testing and reviewing the Radeon RX 6800 XT and Radeon RX 6800. These GPUs were launched in late 2020 but as expected, availability in India has been rather constrained. Does their performance justify going against the grain and choosing them over competing Nvidia offerings? Can AMD pull off a decisive victory after all? Let’s find out.
AMD RDNA 2 architecture
While it might have lagged in terms of PC market share, AMD’s graphics division has dominated the high-end gaming console space for years now. The current and previous generations of Sony’s PlayStation and Microsoft’s Xbox consoles have all used some form of Radeon GPU, which does give AMD some advantages and insight into what capabilities gamers and game developers will be using over the next several years.
Last year’s Radeon RX 5700 XT, based on the RDNA 1 architecture, was a significant improvement over its predecessors, delivering 50 percent better performance per Watt. AMD now says that RDNA 2 can achieve 54 percent better performance per Watt than even that. The RDNA2 chip on which these two GPUs are based has over 26 billion transistors, though each GPU model uses different amounts of these resources.
At the architecture level, the number of Compute Units (CUs) has doubled, going from 40 to 80, and new ray accelerator blocks have been added for ray tracing. Even though RDNA 2 uses the same 7nm manufacturing process as RDNA, optimisations to power handling and the processing pipeline itself, along with clock speeds boosted by up to 30 percent, have resulted in this generation-on-generation performance uplift.
A big part of that comes from what AMD is calling “Infinity Cache”. This high-density 128MB L3 cache is based on AMD’s Epyc server CPU designs. It’s said to be much faster than accessing GDDR6 RAM and is integrated into the on-die Infinity Fabric interconnect for high-bandwidth, low-latency communication. The Infinity cache helps keep data ready for the GPU so it can work at high frequencies without getting bottlenecked, which is an advantage when dealing with 4K-quality game assets.
Another new feature, which AMD calls "Smart Access Memory", essentially allows a computer's CPU to directly access the entire contents of the GPU's memory. CPU access has so far been limited to a 256MB window because of the nature of the PCIe connectivity protocol, but PCIe 4.0 now allows for what's called a resizable base address register (BAR). Smart Access Memory will require a Ryzen 5000 series desktop CPU and Radeon 6000 series GPU, plus a 500-series motherboard, and of course all the latest BIOS and driver updates. This is said to deliver up to 11 percent better performance in some games, and developers could optimise better for it in the future.
Along with ray tracing, the DirectX 12 Ultimate API introduces DirectStorage, which lets a GPU load game assets directly from a high-speed SSD to GPU memory, without going through system RAM or requiring CPU resources. This is being applied in next-gen consoles and should improve loading time for levels and transitions in games.
The updated HDMI 2.1 standard will allow for 4K 144Hz or 8K 60Hz video output, and AMD says it’s ready for Display Stream Compression (DSC), which should allow for even higher resolutions. If you stream your gameplay online, you might benefit from 8K AV1 decode and 8K HEVC encode in hardware.
The Radeon Software utility now supports automatic as well as fully manual overclocking. In addition, Performance Tuning Presets will let users apply optimised per-game profiles, which AMD says will not void a graphics card’s warranty. Radeon Anti-Lag and Radeon Boost are both features that can potentially make games feel more responsive by optimising input latency and dynamically reducing resolution in small patches.
AMD Radeon RX 6800 XT and Radeon RX 6800 specifications
4K is the target resolution for all games at high quality, even for the Radeon RX 6800 – lower-priced GPUs will presumably be launched later this year for those who are targeting 1440p and 1080p, but the three models launched so far all target enthusiast-class gamers. The baseline for the Radeon RX 6800 is 4K 60fps in nearly all current-day games, and it’s positioned as an improvement over the GeForce RTX 2080 Ti, which means it will take on the GeForce RTX 3070. You should also see over 90fps at 1440p, according to AMD.
The higher-priced Radeon RX 6800 XT is set to compete with the GeForce RTX 3080, staying above 4K 60fps in today’s AAA games even at the highest quality settings, or delivering 144-240fps at 1440p in many esports titles.
However, enabling ray tracing will take a toll on these figures. Even AMD’s official projected performance numbers indicate that 1080p and 1440p are more realistic expectations if you want to stay at or above 60fps. Nvidia gets around this using its machine-learning-powered upscaling algorithm DLSS (Deep Learning Supersampling), which lets its GPUs render lower resolutions and then stretch them out without losing quality. AMD has something similar in the works, but it isn’t ready yet. The good news is that the GPU’s ray tracing hardware isn’t just for games; it will help speed up work in content creation applications such as Blender and Autodesk Maya.
The Radeon RX 6800 features 60 active CUs with one ray accelerator and 64 stream processors each, for a total of 3,840 stream processors. It runs at burst speeds of up to 2105MHz. Its bigger sibling, the Radeon RX 6800 XT, has 72 CUs with an equal number of ray accelerators and 4,608 stream processors, and runs at up to 2250MHz. Both feature 16GB of 256-bit GDDR6 RAM with a peak bandwidth of 512GBps. The two have rated TDPs of 250W and 300W, and AMD recommends 650W and 750W power supplies respectively.
AMD Radeon RX 6800 XT and Radeon RX 6800 reference design
AMD’s partner brands, which include Asus, ASRock, Gigabyte, MSI, Sapphire, XFX, and PowerColor, are free to design their own graphics cards featuring Radeon GPUs, including ones with AIO liquid coolers. However, quite a few of the cards you’ll see in the market will use AMD’s reference design. Sapphire in particular, which is one of the more easily available brands in India, will be selling cards identical to the reference units that we are reviewing today.
AMD doesn’t sell its own cards like Nvidia does with its Founders Edition models, but a lot of work still goes into their design. The Radeon RX 6800 XT and Radeon RX 6800 reference cards are heavy and feel solidly built. They both have vapour chamber coolers and a die-cast aluminium shroud with three custom-designed axial fans. The fans can spin down completely when the GPU isn’t being stressed.
All this is to optimise acoustic as well as cooling efficiency, and you should find these cards quieter than the previous generation. Even so, this design is also claimed to allow for some degree of overclocking to keep enthusiasts happy. AMD says it has used a graphite pad to interface between the GPU itself and the cooler, rather than thermal compound which can dry out over time. There are also thermal pads between the cooler, GDDR6 memory, and power circuitry for better heat conductivity.
AMD has used standard 8-pin PCIe power connectors, as opposed to Nvidia’s recent shrunken standard that requires an adapter. Both the Radeon RX 6800 XT and the Radeon RX 6800 have two 8-pin connectors facing upwards at the rear.
The Radeon RX 6800 XT reference design occupies 2.5 slots while the RX 6800 is somewhat slimmer. Both are a standard 10.5 inches long. AMD has used its signature red colour as an accent around the exposed heatsink vents on the top and there’s a red illuminated Radeon logo on each card.
Both cards have one HDMI 2.1 port, two DisplayPort 1.4 ports, and oddly, a USB Type-C port. AMD says this allows for single-wire connections to head-mounted displays since it can supply a video signal plus up to 27W of power. If you already have a triple-monitor setup, you might need some adapters. The decision to go with USB paves the way for future monitors that can be daisy-chained and have a USB hub or other integrated peripherals, but it’s an interesting choice given that the VirtualLink standard is effectively dead, and even Nvidia removed this port from its Founders Edition cards after just one generation.
AMD Radeon RX 6800 XT and Radeon RX 6800 performance
All tests were run using an AMD Ryzen 2700 CPU, ASRock X470 Taichi Ultimate motherboard, 2x8GB of G.Skill F4-3400C16D-16GSXW DDR4 RAM, a 1TB Samsung 860 Evo SSD, and a Corsair RM650 power supply. The monitor was a 4K Asus PB287Q. Windows 10 was updated to the 20H2 release and all drivers were up to date, including the Radeon Software Adrenalin 2020.12.2 release.
Synthetic tests let us compare different GPUs directly. 3DMark is the obvious place to start. The current-gen Time Spy and Time Spy Extreme tests show performance scaling fairly predictably with price across the Nvidia and AMD camps. The GeForce RTX 30-series does pull ahead in the Port Royal DirectX 12 test, while the Radeon RX 6800 siblings have the advantage in the older Fire Strike tests. Unigine’s Superposition benchmark, running at its 4K Optimised preset, also scales with price. If all graphics cards were available at their MRPs, there would be no bad choices, only feature- and budget-related decisions.
| Benchmark | AMD Radeon RX 6800 | AMD Radeon RX 6800 XT | Asus TUF Gaming GeForce RTX 3070 | Nvidia GeForce RTX 3080 Founders Edition | Zotac GeForce RTX 2080 Amp |
| --- | --- | --- | --- | --- | --- |
| 3DMark DLSS Feature Test (off / on) | N/A | N/A | 37.54fps / 88fps | 50.33fps / 69.41fps | N/A |
| 3DMark Port Royal | 7,425 | 8,764 | 8,083 | 10,721 | N/A |
| 3DMark Time Spy | 12,469 | 13,723 | 11,857 | 13,996 | 9,505 |
| 3DMark Time Spy Extreme | 6,068 | 6,798 | 5,912 | 7,191 | N/A |
| 3DMark Fire Strike Extreme | 17,980 | 20,205 | 15,415 | 17,787 | 12,215 |
| 3DMark Fire Strike Ultra | 10,063 | 11,852 | 8,587 | 10,520 | 6,498 |
| 3DMark Wild Life | 72,014 | 73,249 | N/A | N/A | N/A |
| 3DMark Ray Tracing Feature Test | 21.28fps | 26.31fps | N/A | N/A | N/A |
| Unigine Superposition (4K Optimised) | 11,468 | 13,135 | 11,060 | 13,708 | N/A |
Coming to in-game benchmarks, Far Cry 5 ran at 4K using its Ultra preset and managed to hit averages of 80fps on the Radeon RX 6800 XT and 72fps on the RX 6800. Middle Earth: Shadow of War also ran smoothly at 4K with its Ultra quality preset, averaging 73fps on the Radeon RX 6800 but jumping up to 82fps on the Radeon RX 6800 XT. Metro: Exodus also ran at 4K with its Ultra preset (which is not the highest in this case), without ray tracing enabled, at 61fps and 52.35fps on the two cards respectively.
In manual gameplay, Doom: Eternal was extremely smooth at 4K at Ultra Nightmare quality, reaching 150-160fps on the Radeon RX 6800 XT and 120-125fps on the Radeon RX 6800, as measured by the in-game frame counter even in heavy battle scenes. The Witcher 3: Wild Hunt averaged 84fps and 73fps respectively, at 4K Ultra.
I then turned to Battlefield V to check how AMD’s ray tracing implementation affects performance. With no DLSS equivalent, there’s no way to raise the frame rate to compensate for the additional stress that ray tracing puts on the new Radeon GPUs. With the resolution set to 4K and using the Ultra quality preset, the Radeon RX 6800 XT managed around 105fps in a busy scene with plenty of gunfire and running around. With ray tracing enabled, that fell steeply to around 60fps. The Radeon RX 6800 showed a sharper drop, from around 95fps to just 40fps. You’d have to lower the resolution and compromise on quality if you want ray tracing effects, at least for now.
AMD’s new cooler design does seem to be effective, and neither of the two cards tested was noisy even under heavy load.
AMD has clearly done very well with this generation of Radeon GPUs. They’re undoubtedly powerful enough for today’s enthusiasts who demand smooth visuals at 4K and at very high quality settings. The reference design cards are slick and unobtrusive. However, the matter of whether the new Radeon RX 6800 XT and Radeon RX 6800 are capable of challenging Nvidia’s dominance comes down to two factors – features and pricing.
In terms of features, ray tracing of course stands out. AMD’s implementation isn’t fully realised yet. Few games support AMD’s approach so far, though it’s likely that future upscaling tech will make things a lot more even between the two competitors. For now though, if you want to enjoy all the graphical bells and whistles, and claim bragging rights, Nvidia has the advantage. One small note here is that 16GB of GDDR6 RAM is much more than what Nvidia’s current offerings at this price and performance level have, and your use cases might benefit from that.
Finally, we come to the thorny – and ongoing – issue of Radeon prices in India. To be fair, all current-gen GPUs are scarce, and there has been significant price gouging and scalping for the past several months. It’s difficult to make comparisons since prices keep changing and some models simply aren’t available. However, Nvidia does sell its own Founders Edition graphics cards in India at their official MRPs, through a distributor. Although you have to go through a tedious waitlisting process and a lot depends on luck, at least there are periodic updates. AMD has no such arrangement.
AMD’s official prices in India for the Radeon RX 6800 and RX 6800 XT graphics cards are Rs. 45,999 and Rs. 64,990 respectively, not including taxes. In the US, they cost $579 and $649 respectively. It’s strange that the company is using a much higher exchange rate for the RX 6800 XT, which makes this card disproportionately more expensive than its sibling. On top of that, street prices (including tax) in India are all over the place, ranging from a somewhat understandable Rs. 59,999 for a reference Sapphire RX 6800 (listed at only one online retailer but out of stock, with no guarantee it will actually be up for purchase at that price) to an outrageous Rs. 1,10,000 for a Sapphire Nitro+ Radeon RX 6800 XT with a custom cooler. In fact, some Radeon RX 6900 XT listings are actually less expensive.
This should have been a close call between Nvidia and AMD, but the result of this supply-side confusion is that it’s nearly always going to make sense for gamers in India to choose the equivalent GeForce 30-series card. The Radeon RX 6800 might be worth it if you can find it at a good price, but there’s little hope for the Radeon RX 6800 XT here right now – maybe we’ll see significant discounts somewhere down the line, at which point calculations can be reconsidered. AMD does seriously need to sort this out, or it will lose despite whatever progress it has made in terms of performance.
AMD Radeon RX 6800 (reference)
Price: Rs. 45,999 + taxes
- Excellent stock cooler
- Very good performance at 4K and 1440p
- No resolution upscaling at launch
- Expensive and hard to find
Ratings (out of 5)
- Performance: 4
- Value for Money: 4
- Overall: 4
AMD Radeon RX 6800 XT (reference)
Price: Rs. 64,990 + taxes
- Excellent stock cooler
- Very good performance at 4K and 1440p
- No resolution upscaling at launch
- Expensive and hard to find
Ratings (out of 5)
- Performance: 4.5
- Value for Money: 3.5
- Overall: 3.5
How leaders across industries see 5G helping their businesses
Verizon’s 5G Business Report found that everyone’s excited about 5G, but the reasons behind the buzz differ between industries.
A report from Verizon on business leaders’ opinions of 5G finds that 5G adoption is well underway across industries, but the reasons for excitement and the ways in which businesses plan to deploy 5G tech vary greatly. The report surveyed 700 business tech decision-makers, and found that 55% had heard, read, or seen a lot about 5G, and 80% believe it will create new opportunities for their companies. The belief in 5G benefits for business extends to believing that 5G will benefit their individual industries and roles, with 79% saying they agreed with both statements.
There was some split between IT leaders and C-level executives on whether 5G is a top priority: 54% of IT leaders said it was, while only 39% of the C-suite agreed. Another large difference appeared between business leaders and the general public: As mentioned above, 55% of business tech decision-makers said they had heard a lot about 5G, while only 23% of US adults said the same. This could indicate a knowledge gap that drags 5G progress down, or otherwise slows customer adoption of the new technology. Regardless, business leaders seem eager to incorporate 5G into their organizations, both internally and externally.
SEE: TechRepublic Premium editorial calendar: IT policies, checklists, toolkits, and research for download (TechRepublic Premium)
The report breaks businesses down into five categories: sports, entertainment, and media; government and the public sector; healthcare; manufacturing; and retail. 5G applications for each industry vary, and survey responses did as well.
In the entertainment, sports, and media industries, most of the excitement comes from the sheer amount of bandwidth that 5G will be able to deliver. Eighty-four percent of respondents said they believed 5G would eliminate “miles of cables and wiring,” and the same amount said they were excited by high-bandwidth connections allowing for multiple broadcast or video streams. Those numbers were paired with the likelihood of adoption, with 80% saying they were likely to use 5G to eliminate wiring and 74% planning to take advantage of increased bandwidth to increase streaming capabilities.
The public sector sees 5G value in real-time video surveillance and public safety programs. Seventy-four percent said they see that technology as valuable, though only 58% said they were likely to roll that sort of technology out in the next two years. Public sector decision-makers were also excited by the prospect of increasing data transfer speeds to emergency services to reduce response time and smart city sensor networks to improve water availability, air quality, and energy-efficiency monitoring.
In the healthcare world, remote health monitoring technology leads as the most valuable tech, with 81% saying they find its potential valuable. Seventy-five percent said they’re planning to implement such technology. Other healthcare uses include using 5G mobile networks for telemedicine visits, real-time medical image sharing, and real-time wearable devices.
SEE: Future of 5G: Projections, rollouts, use cases, and more (free PDF) (TechRepublic)
Manufacturers were most excited by real-time supply chain tracking, with 88% intrigued by the prospect and 82% saying they were likely to employ such technology. Artificial intelligence (AI) designed to support worker safety also ranked high, with 83% saying it would be valuable, and 74% planning to implement it in the next two years.
Retail was most interested in real-time data processing that would “maximize efficiency from point of sale to product delivery,” with 82% planning to use similar programs, and the ability to analyze foot traffic to dynamically plan displays to maximize product positioning efficiency. Augmented reality (AR) shopping also proved high on the interest list with 75% saying they planned to use augmented reality apps, and 73% saying they were likely to use AR for product visualization.
In summing up the report, Verizon Business CEO Tami Erwin said the data points to 5G being seen as a serious transformative element. “Over the last year, 5G has become top-of-mind for businesses as they manage through condensed digital transformation timelines. These findings underscore the critical role 5G will play in economic recovery and growth, and we stand committed and ready to help our partners make that transition quickly and seamlessly,” Erwin said.
iPhone SE Plus Rumoured to Be in the Works, Price and Specifications Leak
iPhone SE Plus is reportedly in the works, a new leak suggests. Apple introduced the iPhone SE (2020) last year, and the tech giant could be working on a new model in the affordable ‘SE’ series. The iPhone SE Plus pricing and specifications have leaked online alongside a render that hints at the design of the upcoming phone. Apple may introduce the iPhone SE Plus in April, around the same time of year it launched the iPhone SE (2020).
A tipster called @aaple_lab has leaked key specifications and pricing of the rumoured iPhone SE Plus. The phone is expected to be priced around $499 (roughly Rs. 36,300), which is $100 more than the launch price of the iPhone SE (2020). A render leaked alongside shows a wide notch on top of the display and a single rear camera. There is no physical home button on the iPhone SE Plus, a big change from the iPhone SE (2020) that has thick bezels and a physical home button. The tipster claims that the phone may launch in Black, Red, and White colour options.
Coming to the specifications, the iPhone SE Plus is tipped to feature a 6.1-inch IPS display and could be powered by either the Apple A13 Bionic or Apple A14 Bionic chip. The rear camera is tipped to sport a 12-megapixel iSight sensor, whereas the selfie camera is tipped to feature a 7-megapixel sensor. Camera features include six portrait light effects, OIS, and Smart HDR 3. The phone is tipped to come with an IP67 rating for dust and water resistance.
The tipster vaguely suggests that Touch ID could be integrated into the home button. This could be a reference to the power button on the side, but it’s not very clear.
There is no clarity on when this rumoured iPhone SE Plus will launch, but if we were to speculate, it could launch sometime in April — around the same time the iPhone SE (2020) was launched last year. Apple has made no official announcements about the phone yet.