
An annual report measures how fast digital transformation is progressing in 63 economies and scores each one on knowledge, technology, and future readiness.

Image: IMD

The US held onto its top spot in a global ranking of digital competitiveness, thanks to access to capital and an emphasis on robotics. Singapore stayed at No. 2 on the IMD World Digital Competitiveness Ranking (WDCR) for 2020, and earned more No. 1 positions on the individual criteria than the US did. The WDCR measures the capacity and readiness of 63 economies to adopt digital technologies for economic and social transformation. This year’s Top 10 list looks very similar to last year’s:

  1. United States of America
  2. Singapore
  3. Denmark
  4. Sweden
  5. Hong Kong SAR
  6. Switzerland
  7. Netherlands
  8. South Korea
  9. Norway
  10. Finland

Researchers identified a common trend across the top 10: efficient use of digital talent, reflecting both having the technological infrastructure in place and putting the available technology to use. Researchers believe that the fate of economies today can be predicted by how they manage digital transformation, and that COVID-19 was a litmus test for the world.


Economist Mariana Mazzucato, director of the Institute for Innovation and Public Purpose at University College London, said in a press release that digitalization is no longer an option but a necessity.

“Setting bold national targets—including digitalizing public services—creates a dynamic process by which the private sector can scale up through servicing bold procurement programs,” she said. “Europe’s recovery will depend on it.” 

America’s emphasis on education and on research and development pushed the country to the top of the list this year. The US has also seen an increase in e-participation from citizens. Singapore’s ranking reflects its scores on the talent and regulatory framework indicators. Sweden and Denmark stand out for high scores in knowledge building, which covers making economies productive and efficient for citizens.

The Republic of Korea, Denmark, and the US are strongest in individual adaptive attitudes, while Taiwan-China, the US, the Republic of Korea, and China did best in business agility. North America is the leading region for robotics education, with Singapore a close second.

The UK came in at No. 13 overall, and its scores on the individual criteria have held steady over the years. The country scores best on knowledge and future readiness: the report identified the net flow of international students, robotics research, and R&D productivity by publication as its biggest strengths. Employee training is a weakness.

Australia is in the No. 15 spot with technology scoring highest among the three criteria. This country’s top strengths include investment in telecommunications, mobile broadband subscribers, and tablet ownership. Weaknesses include agility of companies and high-tech patent grants.

The report includes a detailed profile of each country with scores from the last four editions, scores on the factors and subfactors, and strengths and weaknesses.

The Institute for Management Development evaluated countries on three criteria:

  • Knowledge: Talent, training and education, and scientific concentration
  • Technology: Regulatory framework, capital, and technological framework
  • Future readiness: Adaptive attitudes, business agility, and IT integration

The US dropped from a ranking of 3 to 10 on the competitiveness criteria, and its Technology ranking fell from 5 in the 2019 report to 7 in the 2020 edition. The US ranking on regulatory framework has declined steadily, from 12 in 2016 to 22 in 2020.

Singapore ranked first in both regulatory framework and technological framework but 11th in capital, where the US held the No. 1 ranking.

The report found that Eastern Asia, Western Europe and North America are the most progressive regions in terms of digital capabilities. Latin America, Central Asia, and Eastern Europe have the most work to do to catch up.

The data that informs the rankings comes from research done in 2019 and a survey conducted after the pandemic started in 2020. The survey did not include questions about COVID-19, but the analysts considered the role technology played in how countries responded to the crisis.

The IMD works with partner institutes around the world to produce the report.

Targeting Aging is the Way to Treat Diseases of Aging


Nearly all work to date on the treatment of age-related disease has failed to consider or target the underlying mechanisms of aging, the molecular damage that accumulates to cause pathology. It has instead involved one or another attempt to manipulate the complicated, disarrayed state of cellular metabolism in late-stage disease, chasing proximate causes of pathology that are far downstream of the mechanisms of aging. This strategy has largely failed, and where it has succeeded, it has produced only modest benefits. Consider that statins, widely thought to be a major success in modern medicine, do no more than somewhat reduce and delay mortality due to atherosclerosis. They are not a cure. The mechanisms of aging are why age-related diseases such as atherosclerosis exist. They are the root cause of these diseases. Attempted therapies that continue to fail to target the mechanisms of aging will continue to fail to deliver meaningful benefits to patients. This must change.

Targeting Aging is the Way to Treat Diseases of Aging

Image credit: Pixabay (Free Pixabay license)

Aging doesn’t kill people – diseases kill people. Right? In today’s world, and in a country like the United States, most people die of diseases such as heart attack and stroke, cancer, and Alzheimer’s. These diseases tend to be complex, challenging, difficult, and extremely ugly to experience. And they are by nature chronic, caused by multifactorial triggers and predispositions and lifestyle choices. What we are only now beginning to understand is that the diseases that ultimately kill us are inseparable from the aging process itself. Aging is the root cause. This means that studying these diseases without taking aging into account could be dangerously misleading … and worst of all, impede real progress.

Take Alzheimer’s disease. To truly treat a disease like Alzheimer’s, we would need to identify and understand the biological targets and mechanisms that trigger the beginning of the disease, allowing us to intervene early – ideally, long before the onset of disease, to prevent any symptoms from happening. But in the case of diseases like Alzheimer’s, the huge problem is that we actually understand very little about those early targets and mechanisms. The biology underlying such diseases is incredibly complex. We aren’t sure what the cause is, we know for sure there isn’t only one target to hit, and all prior attempts to hit any targets at all have failed. When you start to think about how much of what we think we know about Alzheimer’s comes from very broken models – for example, mice, which don’t get Alzheimer’s naturally – it becomes totally obvious why we’re at a scientific stalemate in developing treatments for the disease, and that we’ve likely been coming at this from the wrong direction entirely.

The biggest risk factor for Alzheimer’s isn’t your APOE status; it’s your age. People in their twenties don’t get Alzheimer’s. But after you hit the age of 65, your risk of Alzheimer’s doubles every five years, with your risk reaching nearly one out of three by the time you’re 85. What if going after this one biggest risk factor is the best vector of attack? Maybe even the only way to truly address it? This isn’t about the vanity of staying younger, about holding on to your good looks or your ability to run an 8-minute mile. It’s about the only concrete possibility we have to cure these diseases. Instead of choosing targets for a single specific disease, i.e., a specific condition that arises in conjunction with aging, we can get out in front of disease by choosing targets that promote health. And we can identify these by looking at disease through the lens of the biology of aging.
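The arithmetic behind "doubles every five years" can be sketched in a few lines. This is a toy model, not from the article: the function name and the ~2% baseline risk at age 65 are assumptions, chosen only because four doublings from 2% land near the one-in-three figure the piece cites for age 85.

```python
def alzheimers_risk(age, base_risk=0.02, base_age=65):
    """Toy model: risk doubles every five years after base_age.

    base_risk (~2% at 65) is an assumed illustrative figure,
    not a number given in the article.
    """
    if age < base_age:
        raise ValueError("model only applies from base_age onward")
    doublings = (age - base_age) / 5
    return base_risk * 2 ** doublings

# Four doublings between 65 and 85: 2% -> 4% -> 8% -> 16% -> 32%,
# i.e. roughly one in three.
for age in range(65, 90, 5):
    print(age, f"{alzheimers_risk(age):.0%}")
```

The point of the sketch is just that exponential doubling turns a small baseline into a dominant risk over 20 years, which is why age swamps any single genetic factor.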

Link: https://a16z.com/2020/10/07/aging-alzheimers-drug-discovery/

Source: Fight Aging!





The Mandalorian Season 1 Recap Distills the Star Wars Series Into 89 Seconds


Before The Mandalorian season 2 premieres Friday afternoon on Disney+ Hotstar (and at midnight Friday on Disney+ in the US), Disney and Lucasfilm have given us an official 89-second recap of The Mandalorian season 1. That’s very brief, but it speaks to the fact that The Mandalorian wasn’t a narratively heavy show in its debut season last year.


The Mandalorian season 1 recap touches upon Mando’s (Pedro Pascal) profession (he’s a bounty hunter), his newest target (Baby Yoda), the people he meets along the way — Cara Dune (Gina Carano), Greef Karga (Carl Weathers), and Kuiil (voiced by Nick Nolte) — and the consequences of his decision to bring Baby Yoda under his wing.

“You have something I want. It means more to me than you will ever know,” the darksaber-wielding villain Moff Gideon (Giancarlo Esposito) says deep into The Mandalorian season 1 recap, as we are given a reminder of the Star Wars series’ action-heavy side. Gideon then declares: “It will be mine.”

The season 1 recap wraps by setting up The Mandalorian season 2, as tribe leader The Armorer (Emily Swallow) instructs Mando to reunite Baby Yoda “with its own kind”. Mando wonders: “You expect me to search the galaxy for the home of this creature?” Well, yes, otherwise what would we do in season 2, Mando.

In addition to Pascal, Carano, Weathers, and Esposito, The Mandalorian season 2 also stars Omid Abtahi as Dr. Pershing, Horatio Sanz as Mythrol, Rosario Dawson as Ahsoka Tano, Katee Sackhoff as Bo-Katan Kryze, Temuera Morrison as Boba Fett, Timothy Olyphant as former slave Cobb Vanth, Michael Biehn as a rival bounty hunter, and Sasha Banks in an undisclosed role.

Jon Favreau (The Lion King, Iron Man) created The Mandalorian and serves as showrunner and head writer on the Star Wars series. Favreau and Weathers are among the directors on season 2 alongside Dave Filoni, Rick Famuyiwa, Bryce Dallas Howard, Peyton Reed, and Robert Rodriguez.

The Mandalorian season 2 premieres October 30 on Disney+ Hotstar in India. Episodes will air weekly.


Wild West for developers when it comes to writing cloud-native apps


Commentary: Containers ate your infrastructure, but what comes next at the application layer? A new survey points to big, industry-wide decisions to be made about the tech used to write applications.

Image: vladans, Getty Images/iStockphoto

Twenty years ago it seemed certain that the underpinnings of future data center infrastructure would be Linux clusters running on x86 “commodity” hardware. We just didn’t know what to call it or where exactly it would run.

The big systems vendors like IBM, Sun, HP, and Cisco weren’t calling it “cloud”; instead, they named it utility computing, autonomic computing, grid computing, on-demand, and any number of other terms. At Comdex 2003, ZDNet reported that “participants in a panel discussion at Comdex agree that utility computing is more like a river than a rock, but have little luck nailing down a real definition.” (ZDNet is a sister site of TechRepublic.)

Two decades later we know what to call it (“cloud”), and we know it’s built with containers and a whole lot of Linux. As detailed in the new Lightbend survey Cloud Native Adoption Trends 2020-2021, 75.2% of respondents already host the majority of their applications in some sort of cloud infrastructure, and roughly 60% run most of their new applications in Kubernetes/containers.

Now we’re faced with another major rethink that will affect tens of millions of developers operating at the application layer, where there are common threads on crucial concepts, but everyone is bringing different approaches and predictions for the future.


Higher up the stack

We’ve all felt this happening as containers have eaten into the virtual machine landscape, as web programming languages (JavaScript) surpass JVM/server-side languages (Java) in developer popularity, and as serverless, JAMstack, and other still-being-named phenomena change the “developer experience” in writing cloud-native applications. The diversity of choice in the “right way” to write software for the cloud has become a bit of a Wild West for developers.

As Google developer advocate Kelsey Hightower put it earlier this year, “There’s a ton of effort attempting to ‘modernize’ applications at the infrastructure layer, but without equal investment at the application layer, think frameworks and application servers, we’re only solving half the problem.”

“There’s a huge gap between the infrastructure and building a full application,” said Jonas Bonér, CTO and co-founder at Lightbend, in an interview. “It’s an exercise for the programmer to fill in this huge gap of what it actually means to provide SLAs to the business, all the things that are hard in distributed systems but needed for the application layer to make the most of Kubernetes and its system of tools.”


Lightbend’s cloud adoption report highlights some of these major decision points that remain murky for the application layer of the cloud-native stack.

“Building cloud-native applications means creating software that is designed with the advantages—and disadvantages—of the cloud in mind,” said Klint Finley, author of the Lightbend survey. “It means taking advantage of the fact that it’s possible to outsource entire categories of functionality—like databases and authentication—to public cloud services and planning for the fact that communication between those cloud components might be unreliable.”

Some developers still prefer frameworks they personally maintain and scale, while the business side clearly prefers frameworks delivered “as a service” via API, the survey says. Namely, 54.7% of managers said it was their highest priority to write business applications that specifically leverage the underlying cloud infrastructure, vs. 38.3% of developers. Meanwhile, consuming back-end services by API rather than building and maintaining your own is the defining characteristic of the emergent JAMstack (JavaScript / API / Markup) architecture, which has the weight of Facebook’s React framework behind it. But it’s a completely different approach from the old-school server-side mindset of the Java developers who still rule the roost and command vast legacy systems at most major enterprises.

The survey also suggests that developers think about cloud computing more in terms of specific technologies like Kubernetes and containers, while management thinks of cloud computing more as a new way to build applications. Management tends to prefer outsourcing as much maintenance as possible, while developers’ preference for configurability over automation reveals a desire not to lose too much control over the many layers of an application stack. As one respondent put it: “SaaS comes with ease of adoption and faster time to market, however many do not understand the cost of running them at scale.”

Disclosure: I work for AWS, but the views expressed herein are mine.
