Remote workers are happy at home, but many still feel pressured to work longer hours because of management fears about lost productivity. Here’s how organizations can change to suit the new normal.

Survey results from software recommendation site GetApp find that workers are ready for the new normal of remote work, but business leaders have yet to evolve past suspicions that remote workers are slacking off. COVID-19 has changed the face of work across the planet: Offices are closed, professionals of all kinds are working from home, and remote collaboration is the new normal. According to GetApp’s survey results, workers are completely fine with that.

Fifty percent of respondents said they want the option to stay remote permanently, and job satisfaction levels are incredibly high: 90% said they’re happy with their individual quality of work, 84% like their new work/life balance, 83% are happy with their individual productivity, and 79% are happy with their team’s productivity. By contrast, only 16% want to be back in the office with social distancing, and only 17% said they’re more productive in the office. 

SEE: COVID-19 workplace policy (TechRepublic Premium)

In short, no one in the rank-and-file is mourning the death of office culture. Business leaders, on the other hand, aren’t so sure.

Forty-four percent of SMB leaders, the survey found, want to let their employees work remotely some of the time, but 39% continue to believe that remote workers are less productive. “The data just isn’t there to support this impression of reduced productivity while working from home, and this myth is causing employee burnout. The myth is so powerful, employees pressure themselves to work longer hours even if they’re not feeling direct or indirect pressure from their managers to do so,” GetApp said. 

Despite employees being happy at home, the myth of productivity loss is causing 82% of workers to put in extra hours, work on the weekends, take calls before and after business hours, and answer emails on days off. The result of this pressure may be an increase in quantity of work, GetApp said, but not quality. “With 90% of employees feeling satisfied with the quality of their work, the pressure to work more hours isn’t going to move the needle the four or five percent you may think it will—instead you’ll end up with burned-out employees whose productivity is now actually suffering,” GetApp senior content analyst Olivia Montgomery said in a blog post about the survey results.

“As an employer, you need to embrace the remote work culture and help employees through the challenges they’re experiencing now, instead of ignoring them and thinking we’re all going back to the office soon,” Montgomery said.

SEE: Big data’s role in COVID-19 (free PDF) (TechRepublic)

GetApp has several recommendations for business leaders who want to learn to manage workers in a world of remote work. The key is trusting employees and empowering them to meet their objectives, and leaders can do that by: 

  • Taking a look at the software people need for work, and re-evaluating whether or not it’s the best option for remote work.
  • Moving to a smaller workspace, which not only saves money but still allows employees who want (or need) to be in the office to come in once coronavirus concerns have passed.
  • Converting existing office spaces into collaboration hubs where teams that need to meet can come together for brainstorming sessions, meetings, etc. 
  • Encouraging asynchronous communication so employees don’t feel pressured to respond in real time.
  • Providing a device stipend. Using personal devices isn’t only a security risk, it’s also a source of worry for 32% of respondents.
  • Implementing a BYOD policy, even if the device being used isn’t physically coming into the office.

“Employees want to stay remote because they actually like most aspects of it, but as an employer, you need to change your perspective and embrace this new way of working,” Montgomery said. 

Preserving privacy of machine learning models

When you see headlines about artificial intelligence (AI) being used to detect health issues, that’s usually thanks to a hospital providing data to researchers. But such systems aren’t as robust as they could be, because the data is usually drawn from only one organization.

Hospitals are understandably cautious about sharing data in a way that could get it leaked to competitors. Existing efforts to handle this issue include “federated learning” (FL), a technique that enables distributed clients to collaboratively learn a shared machine learning model while keeping their training data localized.

However, even the most cutting-edge FL methods have privacy concerns, since it’s possible to leak information about datasets using the trained model’s parameters or weights. Guaranteeing privacy in these circumstances generally requires skilled programmers to take significant time to tweak parameters – which isn’t practical for most organizations.

A team from MIT CSAIL thinks that medical organizations and others would benefit from their new system PrivacyFL, which serves as a real-world simulator for secure, privacy-preserving FL. Its key features include latency simulation, robustness to client departure, support for both centralized and decentralized learning, and configurable privacy and security mechanisms based on differential privacy and secure multiparty computation.
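
To make the idea concrete, here is a minimal, illustrative sketch of federated averaging with differential-privacy-style noise added on the server side. It is not PrivacyFL’s actual API; the function names and parameters (clip_update, dp_federated_average, clip_norm, noise_scale) are hypothetical and chosen only to show the general technique.

    import numpy as np

    def clip_update(update, clip_norm):
        # Bound each client's contribution so no single client dominates the average
        norm = np.linalg.norm(update)
        return update * min(1.0, clip_norm / (norm + 1e-12))

    def dp_federated_average(client_updates, clip_norm=1.0, noise_scale=0.1, rng=None):
        # Average clipped client updates, then add Gaussian noise so the
        # aggregated model reveals less about any individual client's data.
        rng = rng or np.random.default_rng(0)
        clipped = [clip_update(u, clip_norm) for u in client_updates]
        mean_update = np.mean(clipped, axis=0)
        noise = rng.normal(0.0, noise_scale * clip_norm / len(client_updates),
                           size=mean_update.shape)
        return mean_update + noise

    # Three simulated clients (e.g., hospitals) send model-weight updates to the server
    updates = [np.random.default_rng(seed).normal(size=4) for seed in range(3)]
    print(dp_federated_average(updates))

In a real deployment the noise scale would be calibrated to a formal privacy budget, which is exactly the kind of parameter tuning the researchers note is hard to get right by hand.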

MIT principal research scientist Lalana Kagal says that simulators are essential for federated learning environments for several reasons.

  1. To evaluate accuracy. Kagal says such a system “should be able to simulate federated models and compare their accuracy with local models.”
  2. To evaluate total time taken. Communication between distant clients can become expensive, so simulations are useful for evaluating whether client-client and client-server communications are beneficial.
  3. To evaluate approximate bounds on convergence and the time taken to converge.
  4. To simulate real-time dropouts. With PrivacyFL, clients may drop out at any time (a minimal sketch of this kind of simulation follows below).
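
For illustration only, the sketch below shows how a simulator might model per-round client dropouts and communication latency; the names and values (dropout_prob, latency_s) are hypothetical and not drawn from PrivacyFL itself.

    import random

    def simulate_round(clients, dropout_prob=0.2, latency_s=(0.05, 0.5), rng=None):
        # Return the clients that participate this round and the simulated
        # round time, bounded by the slowest surviving client's link.
        rng = rng or random.Random(42)
        survivors = [c for c in clients if rng.random() > dropout_prob]
        round_time = max((rng.uniform(*latency_s) for _ in survivors), default=0.0)
        return survivors, round_time

    rng = random.Random(42)
    clients = [f"hospital_{i}" for i in range(5)]
    for r in range(3):
        survivors, t = simulate_round(clients, rng=rng)
        print(f"round {r}: {len(survivors)} clients participated, ~{t:.2f}s slowest link")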

Using the lessons learned with this simulator, the team is now developing an end-to-end federated learning system for real-world scenarios. For example, such a system could be used by collaborating hospitals to train robust, privacy-preserving models that predict complex diseases.

Written by Adam Conner-Simons, MIT CSAIL

Source: Massachusetts Institute of Technology

Oppo A33 (2020) With Triple Rear Cameras, 5,000mAh Battery Launched in India: Price, Specifications

Oppo A33 (2020) has been launched in India, featuring a large 5,000mAh battery, a Qualcomm Snapdragon 460 SoC, and a hole-punch display with a 90Hz refresh rate. The smartphone also has a rear fingerprint scanner and a triple camera setup at the back. The Oppo A33 (2020) was unveiled in Indonesia in September; in India, it is already available via offline retail stores and will go on sale via Flipkart later this month, the company announced.

Oppo A33 (2020) price in India, launch offers (expected)

The Oppo A33 (2020) has been priced at Rs. 11,990 for its 3GB RAM + 32GB storage option. Oppo says it is available via offline retail stores and will go on sale from Flipkart in its “next Big Billion Day sale.” Offers include 5 percent cashback on Kotak Bank, RBL Bank, Bank of Baroda, and Federal Bank cards, and buyers who purchase the phone through Paytm will see listed benefits worth Rs. 40,000. Offline, financing options are also available through Bajaj Finserv, Home Credit, HDB Financial Services, IDFC First Bank, HDFC Bank, and ICICI Bank.

Oppo A33 (2020) specifications

The Oppo A33 (2020) runs on ColorOS 7.2 based on Android 10 and features a 6.5-inch HD+ (720×1,600 pixels) hole-punch display with 90Hz refresh rate. Under the hood, there is the octa-core Qualcomm Snapdragon 460 SoC. Internal storage is at 32GB with the option to expand further using a microSD card (up to 256GB).

The Oppo A33 (2020) has a triple rear camera setup that includes a 13-megapixel primary sensor, a 2-megapixel depth sensor, and a 2-megapixel macro shooter. For selfies, the Oppo A33 (2020) has an 8-megapixel front camera.

There is a 5,000mAh battery with 18W fast charging support on the Oppo A33 (2020), along with a fingerprint sensor on the back of the handset. The phone also comes with dual stereo speakers. Connectivity options include Bluetooth v5, a USB Type-C port, Wi-Fi 802.11ac, and more.



Big data and DevOps: No longer separate silos, and that’s a good thing

The pandemic has caused major shifts in the way IT and big data work. Now they may be working together for better outcomes.

The world has changed a lot since March 2020, and the coronavirus pandemic has affected nearly every aspect of our lives. While we’ve seen massive changes in technology already, another change happening right now is in big data and its role with DevOps.

“The COVID-19 pandemic has accelerated the blending of data analytics and DevOps, meaning developers, data scientists, and product managers will need to work more closely together than ever before,” said Bill Detwiler, editor in chief of TechRepublic. 

SEE: TechRepublic Premium editorial calendar: IT policies, checklists, toolkits, and research for download  (TechRepublic Premium)

Detwiler was interviewing managers at Tibco, a leader in big data integration and analytics. They said the coronavirus pandemic had caused organizations to rethink how they were using big data and analytics, generating what appears to be a movement toward merging IT DevOps methodologies with big data analytics.

For IT organizations, this is more than just a story about how the pandemic has altered how companies think about big data and analytics. The COVID-19 emergency has placed new emphasis on getting analytics insights and results to market quickly. This has redefined analytics reporting as mission-critical, rather than just an ancillary tool for how companies operate and strategize.

SEE: Return to work: What the new normal will look like post-pandemic (free PDF) (TechRepublic)

The change is also creating revisions in operations and culture for IT. Here are some we’ve seen.

A move from waterfall to DevOps development

Developing, testing, and deploying big data applications is an iterative process: you develop, test, and deploy repeatedly until you get what you want. That doesn’t fit the more linear, assembly-line methodology of traditional IT waterfall development, which is a serial sequence of handoffs from development to QA (test) to an implementation team.

SEE: Are you a big data laggard? Here’s how to catch up (TechRepublic)

A majority of IT departments are still organized around the waterfall development paradigm. There are separate silos within IT for development, testing, and deployment. These functions have to come together with each other and end users in the more collaborative and iterative process of big data application development. To do this, functional silos of expertise have to dissipate. 

Culturally (and perhaps organizationally) this changes the orientation of IT. The culture shift is likely to entail the creation of interdisciplinary functional teams instead of work handoffs from functional silo to functional silo. End users also become active participants on these interdisciplinary teams.

Fewer absolutes for quality

The testing of big data applications becomes more relative and less absolute. This is a tough adjustment for IT because in traditional transaction systems, you either correctly move a data field from one place to another, or you obtain a value based on data and logic that absolutely conforms to what the test script dictates. If you don’t attain absolute conformance, you retest until you do. 

SEE: Big data: How wide should your lens be? It depends on your use (TechRepublic)

Not so much with big data, which could start off with results being only 80% accurate, but with the business deeming them close enough to indicate an actionable trend.

Working in a context where less-than-perfect precision is acceptable is a challenging adjustment for IT pros, who are used to seeing an entire system blow up if a single character in a program or script is miskeyed.
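
As a rough illustration, a quality gate for an analytics model might assert that results clear an agreed accuracy floor (echoing the 80% example above) rather than match an exact expected output. The function and test names below are hypothetical, not taken from the article.

    def accuracy(predictions, labels):
        # Fraction of predictions that match the ground-truth labels
        correct = sum(p == y for p, y in zip(predictions, labels))
        return correct / len(labels)

    def test_model_meets_accuracy_floor():
        # Transaction-style tests demand exact matches; analytics tests can
        # instead check that results are "close enough" to act on.
        predictions = [1, 0, 1, 1, 0, 1, 0, 1, 1, 0]
        labels      = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]
        assert accuracy(predictions, labels) >= 0.80  # business-agreed floor, not 100%

    test_model_meets_accuracy_floor()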

The shift of big data into mission-critical systems

If you’re a transportation company, the ability to track your loads on the road and the health and safety of the cargo that they’re carrying becomes mission-critical. If you’re in the armed forces and you’re using drones on the battlefield to conduct and report reconnaissance in real-time flyovers, the data becomes mission-critical.

SEE: Big data success: Why desktop integration is key (TechRepublic)

This means that organizations must begin to attach the label of mission-critical to big data and analytics applications that formerly were classified as experimental. 

IT culture must shift to support mission-critical big data applications for failover, priority maintenance, and continuous development. This could shift IT personnel from traditional transaction support to big data support, requiring retraining to facilitate the change.
