
When people say “cardiovascular disease” in the context of blood cholesterol, they mean atherosclerosis. This is the name given to the build-up of fatty deposits that narrow and weaken blood vessels, leading to heart failure and ultimately to some form of disabling or fatal rupture, a stroke or heart attack. The primary approach to treatment is the use of lifestyle choices and drugs such as statins to lower the cholesterol carried by LDL particles in the blood. Unfortunately, the evidence strongly suggests that this is the wrong approach, in that the benefits are small and unreliable.

Reducing LDL Cholesterol is the Wrong Target for Cardiovascular Disease

Image credit: Pixabay (Free Pixabay license)

Atherosclerosis does occur more readily with very high levels of LDL cholesterol, as illustrated by the early onset of the condition in patients with genetic disorders such as homozygous familial hypercholesterolemia, in which blood cholesterol can be as high as ten times normal. Yet reducing LDL cholesterol levels, even to a tenth of normal, does very little for patients with established atherosclerotic lesions. One has to look at the mechanisms of the disease in more detail to (a) see why this is the case, and (b) identify which classes of therapy should be attempted instead.

Atherosclerosis is essentially a consequence of the failure of a process called reverse cholesterol transport. When cholesterol becomes stuck in excessive amounts in blood vessel walls, macrophage cells of the innate immune system are called to the site. The macrophages ingest cholesterol and then hand it off to HDL particles. The HDL cholesterol is then carried to the liver to be excreted. This all works just fine in young people. Older people, however, exhibit growing levels of oxidized cholesterols such as the toxic 7-ketocholesterol. Even small amounts of these oxidized cholesterols disrupt macrophage function in ways that are otherwise only achievable through very sizable amounts of cholesterol. The macrophages become inflammatory, cease their work, become loaded down with cholesterol, and die. An atherosclerotic lesion is essentially a self-sustaining macrophage graveyard that will keep pulling in and destroying ever more cells, growing larger as it does so.

The right point of intervention in atherosclerosis is therefore macrophage function. Make macrophages resistant to oxidative cholesterol and cholesterol overload, as Repair Biotechnologies is doing. Or remove oxidized cholesterols from the body, as Underdog Pharmaceuticals is doing. The crucial goal is to allow macrophages to operate normally in the toxic environment of the atherosclerotic lesion; given enough time, it is in principle possible for these cells to dismantle even advanced and sizable lesions. That they do not normally do this is because of oxidized cholesterols or sheer amount of cholesterol, not any other inherent limit.

Doubt cast on wisdom of targeting ‘bad’ cholesterol to curb heart disease risk

Setting targets for ‘bad’ (LDL) cholesterol levels to ward off heart disease and death in those at risk might seem intuitive, but decades of research have failed to show any consistent benefit for this approach, reveals a new analysis. If anything, it is failing to identify many of those at high risk while most likely including those at low risk, who don’t need treatment, say the researchers, who call into question the validity of this strategy.

Cholesterol-lowering drugs are now prescribed to millions of people around the world in line with clinical guidelines. Those with poor cardiovascular health; those with LDL cholesterol levels of 190 mg/dl or higher; adults with diabetes; and those whose estimated risk is 7.5% or more over the next 10 years, based on various contributory factors, such as age and family history, are all considered to be at moderate to high risk of future cardiovascular disease. But although lowering LDL cholesterol is an established part of preventive treatment, and backed up by a substantial body of evidence, the approach has never been properly validated, say the researchers.
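The guideline criteria listed above amount to a simple eligibility rule. Here is a minimal sketch using only the thresholds stated in the text (190 mg/dl LDL-C, diabetes, 7.5% ten-year risk); the function name and structure are illustrative, not taken from any clinical guideline implementation, and this is not a clinical tool.

```python
# Illustrative sketch of the guideline criteria described above -- a
# simplification for clarity only. The thresholds (190 mg/dl LDL-C,
# 7.5% ten-year risk) come from the text; the rest is hypothetical.

def moderate_or_high_risk(ldl_mg_dl: float,
                          has_diabetes: bool,
                          ten_year_risk_pct: float,
                          poor_cardiovascular_health: bool = False) -> bool:
    """Return True if any of the guideline criteria in the text applies."""
    return (poor_cardiovascular_health
            or ldl_mg_dl >= 190
            or has_diabetes
            or ten_year_risk_pct >= 7.5)

print(moderate_or_high_risk(200, False, 3.0))  # LDL threshold alone qualifies
print(moderate_or_high_risk(120, False, 5.0))  # no criterion met
```

Note how the rule is a pure disjunction: any single criterion is enough, which is part of why the researchers argue it sweeps in many low-risk people.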

Hit or miss: the new cholesterol targets

This analysis highlights the discordance between a well-researched clinical guideline written by experts and empirical evidence gleaned from dozens of clinical trials of cholesterol reduction. It further underscores the ongoing debate about lowering cholesterol in general and the use of statins in particular. In this analysis over three-quarters of the cholesterol lowering trials reported no mortality benefit and nearly half reported no cardiovascular benefit at all.

The widely held theory that there is a linear relationship between the degree of LDL-C reduction and the degree of cardiovascular risk reduction is undermined by the fact that some randomized controlled trials with very modest reductions of LDL-C reported cardiovascular benefits while others with much greater degrees of LDL-C reduction did not. This lack of exposure-response relationship suggests there is no correlation between the percent reduction in LDL-C and the absolute risk reduction in cardiovascular events.
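The exposure-response check described above can be sketched numerically: correlate percent LDL-C reduction against absolute cardiovascular risk reduction across trials. The figures below are hypothetical placeholders, not data from the analysis; they exist only to show what a flat relationship looks like when computed.

```python
# Sketch of an exposure-response check: does greater LDL-C reduction
# predict greater absolute risk reduction? Trial values below are
# HYPOTHETICAL, chosen only to illustrate a weak relationship.
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

ldl_reduction_pct = [5, 15, 25, 35, 50]          # hypothetical trials
abs_risk_reduction = [1.2, 0.1, 0.9, 0.2, 0.8]   # no clear trend

r = pearson(ldl_reduction_pct, abs_risk_reduction)
print(f"r = {r:.2f}")  # weak correlation: no exposure-response relationship
```

A linear dose-response theory predicts r close to 1 across trials; the analysis argues the real trial record looks more like the flat pattern sketched here.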

Moreover, consider that the Minnesota Coronary Experiment, a 4-year long randomized controlled trial of a low-fat diet involving 9423 subjects, actually reported an increase in mortality and cardiovascular events despite a 13% reduction in total cholesterol. What is clear is the lack of clarity of these issues. In most fields of science the existence of contradictory evidence usually leads to a paradigm shift or modification of the theory in question, but in this case the contradictory evidence has been largely ignored simply because it doesn’t fit the prevailing paradigm.

Source: Fight Aging!





AWS re:Invent Day 1: Top 5 announcements include machine learning, storage innovations, and container capabilities


CEO Andy Jassy covered 24 new product announcements in his three-hour keynote on the first day of the virtual Amazon Web Services event.

Aurora is the fastest growing service in the history of AWS, CEO Andy Jassy said in announcing the next version of Amazon Aurora Serverless at AWS re:Invent.

The newest products and services from Amazon Web Services (AWS) extend cloud tools and services to on-prem installations, reduce the cost of machine learning operations, and create more storage options. AWS CEO Andy Jassy announced 24 new capabilities during the keynote on the first day of AWS re:Invent, which is a virtual event this year. The news covered everything from customer service platforms and computer vision algorithms to machine learning operations and serverless deployments.

Jassy said the company has been focused on listening to customers and inventing more options for instances, containers, and serverless deployments. Reducing costs is always a priority as well, he said, so AWS engineers looked for ways to make common operations more cost efficient.

Here is a look at all the product news AWS announced today.

More support for machine learning

Jassy said that machine learning has been growing rapidly, and that a lot of the costs are related to inference, not the initial training of models. He said that cloud providers have not focused on reducing those costs.

“Alexa has reduced their cost of inference by 30% and lowered their latency by 25% using our Inf1 instances,” he said.

SEE: Cloud data storage policy (TechRepublic Premium)

AWS is also working on reducing costs for training models with Habana Gaudi-based Amazon EC2 instances that will be available in 2021. AWS Trainium is another option for reducing costs and will be available in the second half of 2021. 

The company also announced five new industrial machine learning services:

  • Amazon Monitron for end-to-end machine monitoring, including sensors, a gateway, and a machine learning service to detect abnormal equipment conditions that may require maintenance
  • Amazon Lookout for Equipment to give customers with existing equipment sensors the ability to use AWS machine learning models to detect malfunctions
  • AWS Panorama Appliance for customers with existing cameras in industrial facilities to use computer vision to improve quality control and workplace safety
  • AWS Panorama Software Development Kit (SDK) to allow industrial camera manufacturers to embed computer vision capabilities in new cameras
  • Amazon Lookout for Vision to use AWS-trained computer vision models on images and video streams to find flaws in products or processes

Axis, ADLINK Technology, BP, Deloitte, Fender, GE Healthcare, and Siemens Mobility are using these new machine learning services, according to a press release.

Bringing cloud container capabilities on prem

Jassy claimed that two-thirds of the containers in the cloud run on AWS. AWS has three offerings for containers: Elastic Kubernetes Service, Elastic Container Service, and Fargate.

He said all three offerings continue to grow like weeds, and that many customers use all three services to accommodate a particular team or use case.

Jassy said customers have requested options to manage containers on premises as they make the transition to the cloud. The four new container capabilities announced today extend the same cloud tools to on-prem containers:

  • Amazon ECS Anywhere enables customers to run Amazon Elastic Container Service in their own data centers
  • Amazon EKS Anywhere provides the ability to run Amazon Elastic Kubernetes Service in their own data centers
  • AWS Proton provides a new service to automate container and serverless application development and deployment
  • Amazon Elastic Container Registry Public provides developers a way to share and deploy container software publicly
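To make the on-prem container story above concrete, here is a hedged sketch of a task definition targeting on-premises capacity, shaped as the argument to boto3's `register_task_definition`. The `EXTERNAL` launch type reflects the ECS API as ECS Anywhere later shipped; at the time of this announcement the service was still forthcoming, so treat the field values as assumptions, and the image and family names as hypothetical.

```python
# Hedged sketch: a minimal ECS task definition aimed at on-prem
# (ECS Anywhere) capacity. Names and values are illustrative.
task_definition = {
    "family": "onprem-web",                       # hypothetical name
    "requiresCompatibilities": ["EXTERNAL"],      # on-prem capacity type
    "containerDefinitions": [
        {
            "name": "web",
            "image": "public.ecr.aws/nginx/nginx:latest",
            "memory": 256,
            "essential": True,
        }
    ],
}

# With AWS credentials configured, this would be passed straight through:
#   import boto3
#   boto3.client("ecs").register_task_definition(**task_definition)
print(task_definition["requiresCompatibilities"])
```

The point of the design is in that one field: the same task definition format used for cloud launch types is reused, with only the compatibility target changed, which is how the "same cloud tools on prem" promise is delivered.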

New storage innovations

In his keynote presentation, Jassy emphasized the need to constantly reinvent products and services and to build what customers want. He said Amazon customers have been asking for more options for global content distribution, storage compliance, and data-sharing. These four storage innovations announced Tuesday meet those needs: 

  • Amazon EBS io2 Block Express volumes: A storage area network (SAN) built for the cloud, with up to 256,000 IOPS, 4,000 MB/second throughput, and 64 TB of capacity
  • Next-generation Amazon EBS Gp3 volumes: A new iteration that gives customers the ability to provision additional IOPS and throughput performance independent of storage capacity and is priced 20% lower per GB than previous generation volumes
  • Amazon S3 Intelligent-Tiering: Now adds S3 Glacier Archive and Deep Archive access tiers to the existing Frequent and Infrequent access tiers, to automatically reduce storage costs for objects that are rarely accessed
  • Amazon S3 Replication (multi-destination): The ability to replicate data to multiple S3 buckets simultaneously in the same AWS Region or any number of AWS Regions 
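The multi-destination replication item can be made concrete. The dictionary below is shaped as the `ReplicationConfiguration` argument to S3's `put_bucket_replication` API, with one rule per destination bucket; the bucket names and IAM role ARN are hypothetical placeholders, not values from the announcement.

```python
# Sketch of multi-destination S3 replication: one rule per destination
# bucket. Bucket names and the role ARN are hypothetical placeholders.
replication_config = {
    "Role": "arn:aws:iam::123456789012:role/s3-replication-role",
    "Rules": [
        {
            "ID": f"to-{dest}",
            "Priority": i,                         # rules need distinct priorities
            "Status": "Enabled",
            "Filter": {"Prefix": ""},              # replicate every object
            "DeleteMarkerReplication": {"Status": "Disabled"},
            "Destination": {"Bucket": f"arn:aws:s3:::{dest}"},
        }
        for i, dest in enumerate(
            ["backup-us-west-2", "backup-eu-west-1"], start=1)
    ],
}

# With AWS credentials configured:
#   import boto3
#   boto3.client("s3").put_bucket_replication(
#       Bucket="source-bucket",
#       ReplicationConfiguration=replication_config)
print(len(replication_config["Rules"]))
```

Before this capability, replicating to a second region meant chaining buckets or copying out of band; expressing each destination as its own prioritized rule keeps the fan-out declarative.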

Supporting a move to serverless installations

Aurora is the fastest growing service in the history of AWS, Jassy said in announcing the next version of Amazon Aurora Serverless.

SEE: Top cloud providers in 2020: AWS, Microsoft Azure, and Google Cloud, hybrid, SaaS players (TechRepublic)

This new capability is designed in part to make it easier to migrate from SQL Server to Amazon Aurora. This new iteration includes these features:

  • Aurora Serverless v2 to accommodate hundreds of thousands of transactions in a fraction of a second, delivering up to 90% cost savings compared with provisioning for peak capacity
  • Babelfish for Aurora PostgreSQL to provide the ability to run SQL Server applications directly on Amazon Aurora PostgreSQL with little to no code changes 
  • Open source Babelfish for PostgreSQL extends the benefits of the Babelfish for Amazon Aurora PostgreSQL translation layer; it will be released on GitHub in 2021 under the permissive Apache 2.0 license

Expanding analytics capabilities

Rahul Pathak, vice president of analytics at AWS, said in a press release that these new services represent an order-of-magnitude performance improvement for Amazon Redshift. These new services make it easier for customers to move data between data stores and to ask natural language questions in their business dashboards and receive answers in seconds, according to the company. The new options include:

  • Advanced Query Accelerator for Amazon Redshift provides a new hardware-accelerated cache to improve query performance  
  • AWS Glue Elastic Views lets developers build materialized views that automatically combine and replicate data across multiple data stores
  • Amazon QuickSight Q is a machine learning-powered capability that lets users type questions about their business data in natural language and receive highly accurate answers in seconds

Five new capabilities for Amazon Connect 

Amazon Connect is the company’s customer service-in-a-box offering. John Hancock, Capital One, Intuit, Best Western, Fujitsu, GE Appliances, and Square use the service to make customer service more efficient, according to AWS. The new features announced today include:

  • Connect Wisdom provides contact center agents with real-time information 
  • Connect Customer Profiles gives agents a unified profile of each customer  
  • Real-Time Contact Lens offers a new option for contact center managers to use during a call
  • Connect Tasks automates, tracks, and manages tasks for contact center agents
  • Connect Voice ID delivers real-time caller authentication using machine learning-powered voice analysis



Skoltech scientists run a ‘speed test’ to boost the production of carbon nanotubes


Skoltech researchers have investigated the procedure for catalyst delivery used in the most common method of carbon nanotube production, chemical vapor deposition (CVD), offering what they call a “simple and elegant” way to boost productivity and pave the way for cheaper and more accessible nanotube-based technology. The paper was published in the Chemical Engineering Journal.

Skoltech scientists run a ‘speed test’ to boost the production of carbon nanotubes

Image credit: Skoltech/Pavel Odinev

Single-walled carbon nanotubes (SWCNT), tiny rolled sheets of graphene with a thickness of just one atom, hold huge promise when it comes to applications in materials science and electronics. That is the reason why so much effort is focused on perfecting the synthesis of SWCNTs; from physical methods, such as using laser beams to ablate a graphite target, all the way to the most common CVD approach, in which metal catalyst particles are used to “strip” a carbon-containing gas of its carbon and grow the nanotubes on these particles.

“The road from raw materials to carbon nanotubes requires a fine balance between dozens of reactor parameters. The formation of carbon nanotubes is a tricky and complex process that has been studied for a long time, but still keeps many secrets,” explains Albert Nasibulin, a professor at Skoltech and an adjunct professor at the Department of Chemistry and Materials Science, Aalto University School of Chemical Engineering.

Various ways of enhancing catalyst activation, in order to produce more SWCNTs with the required properties, have already been suggested. Nasibulin and his colleagues focused on the injection procedure, namely on how to distribute ferrocene vapor (a commonly used catalyst precursor) within the reactor.

They grew their carbon nanotubes using the aerosol CVD approach, using carbon monoxide as a source of carbon, and monitored the synthesis productivity and SWCNT characteristics (such as their diameter) depending on the rate of catalyst injection and the concentration of CO2 (used as an agent for fine-tuning). Ultimately the researchers concluded that “injector flow rate adjustment could lead to a 9-fold increase in the synthesis productivity while preserving most of the SWCNT characteristics”, such as their diameter, the share of defective nanotubes, and film conductivity.

“Every technology is always about efficiency. When it comes to CVD production of nanotubes, the efficiency of the catalyst is usually out of sight. However, we see a great opportunity there and this work is only a first step towards an efficient technology,” Dmitry Krasnikov, a senior research scientist at Skoltech and co-author of the paper, says.

Source: Skoltech





Instagram Is Featuring a Giving Tuesday Shared Story to Highlight Accounts That Are Donating


Instagram is featuring a ‘Giving Tuesday’ story at the front of the stories section on the app today. It will highlight accounts you follow that have used the ‘I Donated’ or ‘Donation’ stickers. This is part of Instagram’s plan to encourage users to donate during the charitable season that lasts until Giving Tuesday. On its main and creators handles, the Facebook-owned company is also highlighting creators who are helping to make a difference in their communities.

Soon, Instagram will be testing a more permanent way for users to create and share fundraisers for non-profits within their feed. The company said in a blog post that with this new upcoming feature, users will be able to link to any eligible non-profit directly within their posts.

The “I Donated” and “Donation” stickers, meanwhile, will be added to a shared story for a limited time: when you use the donation sticker, your story is added to the shared Giving Tuesday story on Instagram.

Besides spotlighting creators who are helping to make a difference in their communities on Instagram’s main handle and creators handle, the company is also offering tips on how you can give back — such as volunteering, making donations or simply making a kind gesture to someone in need.

The Facebook-owned company noted some other ways users can show love to small businesses on the app, such as using “Support Small Businesses” or “Buy Black” stickers on Stories to highlight businesses to their followers. Other ways to encourage small businesses mentioned by Instagram include ordering directly from restaurants via Instagram and the @shop account, which showcases the people behind businesses and lets users shop directly.

Meanwhile, Facebook has also introduced new features to give back during the season. The tech giant has pledged to match up to $7 million in eligible donations to US non-profits made on Facebook on Giving Tuesday (today). Facebook also launched Drives in the US, a Community Help feature that makes it easier to collect food, clothes, and other necessities for those in need.




