Quantum Computing Advantage: Today and Tomorrow

To date, the power of computing has enabled a remote economy, remote healthcare, remote collaboration, remote education, secure and contactless transactions, and intelligence that surpasses the human mind. New quantum computing power will usher in a brand new era — providing massive rewards to the companies and countries leading in the space, leaving laggards in the dustbin of history.

Paving a New Road Ahead

We didn’t need MIT to name quantum computing a breakout technology back in 2017 and again in 2020 to know quantum computing is paving a new road ahead. Recently, Google used a quantum computer to solve, in just over three minutes, a problem that would have taken a supercomputer longer than 10,000 years. While this is excellent news, few people understand what a quantum computer does, and many investors don’t know what quantum computing means for their portfolios. Still, the quantum computing opportunity has never been more relevant than it is today.

In the last few years, quantum computing has been gaining traction, with many companies building systems that aren’t yet powerful enough for most real-world use cases but that still show promise. Tomer Diari, an investor from Bessemer Venture Partners, told TechCrunch, “Quantum computing will drive a paradigm shift in high-performance computers as we continue pushing the boundaries of science deeper into the realms of science fiction.”

The Leader in New Tech

In the last year alone, several breakthroughs from researchers, venture-backed companies, and the tech industry have overcome long-standing scientific challenges. This has moved quantum computing from science fiction to reality and positioned it to solve significant world problems.

Companies pursuing different approaches, such as Atom Computing with wirelessly controlled neutral atoms, Honeywell with trapped ions, and Google with superconducting metals, have all reported first-of-their-kind results, setting the stage for the first commercial generation of quantum computers.

At just 40 to 80 error-corrected qubits, these systems could deliver capabilities that surpass classical computers, which will in turn speed up performance in areas like thermodynamic predictions, chemical reactions, resource optimization, and financial forecasting. Companies like Microsoft, IBM, Intel, and Google are closer than anyone has ever been to unlocking quantum computing’s full scope. As many technology and ecosystem breakthroughs begin to converge, the next year will be a decisive moment.

Investors Are Spending

Recently, quantum computing startup Rigetti Computing raised US$79 million in a Series C funding round led by Bessemer Venture Partners and intended to advance its efforts to make quantum computing commercially viable, according to Business Times. EDBI, the investment arm of Singapore’s Economic Development Board, Franklin Templeton, Alumni Ventures Group, DCVC, Morpheus Ventures, and Northgate Capital also participated in the round.

“This round of financing brings us one step closer to delivering quantum advantage to the market,” said Chad Rigetti, founder and CEO of Rigetti, a company that builds and delivers integrated quantum systems over the cloud and develops software solutions optimized for hybrid quantum-classical computing. Hybrid models like this one leverage quantum and classical computations – a more practical quantum computing approach.

Controversy

In a piece published in Science, researchers in China used quantum mechanics to perform, in minutes, computations that would have taken billions of years on conventional machines. The research, which used a photonic quantum computer, claims to be the first definitive demonstration of a “quantum advantage”: solving a problem that would be effectively impossible for classical computers.

However, as mentioned above, last year Google built a quantum computer that it said achieved “quantum supremacy,” performing computations in minutes that would have taken the most powerful supercomputers tens of thousands of years. Google’s quantum computer was programmable. Google’s claim has been contested throughout the quantum computing field, and many argued that a classical supercomputer could have performed the computations faster with a better algorithm. This back-and-forth, and the fact that the field can’t agree on whether to call these achievements “quantum advantage” or “quantum supremacy,” shows quantum computing is still a developing technology.

Looking Ahead

A quantum computer is built from qubits, which can exist in a superposition of many values at once but yield only a single value when measured; a register in a regular computer, by contrast, can only store one value at a time, according to Forbes. Like A.I., the quantum world is built entirely on probabilities, which has left us engulfed in fascination with the possibilities on the horizon. Both the hardware and the algorithms have a long way to go before they reach everyday environments. It’s not an unattainable innovation, though; for now, it’s reachable enough to learn about and research.
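As a rough illustration of the probabilistic behavior described here, a single qubit can be simulated classically in a few lines of Python with numpy. This is only a toy sketch of superposition and measurement, not an actual quantum device; the equal-superposition state and the fixed random seed are illustrative choices.

```python
import numpy as np

# A single qubit in an equal superposition of |0> and |1>. Its two complex
# amplitudes can encode a continuum of states, but a measurement always
# yields just one classical bit.
state = np.array([1, 1], dtype=complex) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(state) ** 2  # [0.5, 0.5]

# Simulate one measurement: the superposition collapses to a single value.
rng = np.random.default_rng(seed=42)
outcome = rng.choice([0, 1], p=probs)
print(f"P(0) = {probs[0]:.2f}, P(1) = {probs[1]:.2f}, measured: {outcome}")
```

Running the measurement many times recovers the 50/50 statistics, but any single run returns exactly one bit, which is the distinction the Forbes comparison is drawing.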

Recent signs show that progress in the lab is starting to transfer into commercial products, specifically in cloud computing. Xanadu announced a partnership with AWS to bring its open-source quantum software library PennyLane to the cloud computing giant. Additionally, IBM reached one of the most widely accepted measures of general quantum computing performance on one of its systems.

____________________________________________________________________

Louis Lehot is a partner and business lawyer with Foley & Lardner LLP, based in the firm’s Silicon Valley, San Francisco and Los Angeles offices, where he is a member of the Private Equity & Venture Capital, M&A and Transactions Practices and the Technology, Health Care, and Energy Industry Teams. Louis focuses his practice on advising entrepreneurs and their management teams, investors and financial advisors at all stages of growth, from the garage to global. Louis especially enjoys being able to help his clients achieve hyper-growth, go public, and successfully obtain optimal liquidity events. Prior to joining Foley, Louis was the founder of a Silicon Valley boutique law firm called L2 Counsel.

Why Organizations Today Need an Analytics Centre of Excellence

Today, anyone can easily get started with analytics. But without an internal center of excellence, it’s impossible to unlock its full value.

Today, analytics capabilities are easier to access and engage with than ever before. Virtually anyone can use streamlined, cloud-based, open-source analytics tools to start generating insights.

However, as many teams that have experimented with these tools have discovered, when a single line-of-business team embarks on an analytics project without expert assistance and a joined-up approach, the results often fall far short of expectations.

The cornerstones of long-term analytics success

In practice, what you get out of an analytics project depends entirely on what you put in.

To get the best results, uncover the strongest actionable insights, and deliver the highest levels of value to the business, all analytics projects and initiatives need three key things:

#1) Well-maintained and clearly defined data

To be successful, analytics demands clean data from multiple sources, all clearly defined and regularly maintained by an expert data team. This can be data already in the organization’s systems and in use, or data from outside agencies, social media, and other external sources. Either way, it must reflect the true state of operations today so that analytics outputs are accurate and can unleash their full potential.

#2) Automated, actionable outputs

Good analytics models deliver insights that people across teams can easily turn into meaningful actions immediately. These outputs should also be automated, so teams are repeatedly served with the insights they need, as often as they’re required, without further effort.

#3) A scalable foundation

Finally, analytics model ecosystems should be scalable. As the volumes and types of data that fuel them grow and change, the model should be able to grow and change alongside them, ensuring it can continue to deliver value for teams long into the future. That also means being deployable at scale, so the solution can ultimately serve as many users across as many teams as required.

The power of an enterprise-wide view

To put those three things in place, you need to build an enterprise-wide view, and a team responsible for analytics delivery that connects:

Your business team: The people who shape and understand what an organization needs to get from analytics

Data and analytics experts: The people who understand data and how to model it, and ultimately hold the skills needed to create automated analytics models

IT professionals: The people with the skills to deploy analytics models as workable solutions, run them in a live ecosystem, and maintain the sources of analytical insights

Bringing those people together under one team is the foundation of an analytics Centre of Excellence (CoE). By combining the expertise and viewpoints of these groups, you can ensure all analytics projects:

- Are expertly created to deliver the highest-quality outputs

- Will continue to deliver long-term value through automated insight delivery

- Are aligned with the needs of the business as a whole – not just an individual team

The five reasons all organizations need an analytics CoE

The value of an analytics CoE goes far beyond just ensuring that analytics services, solutions and insights are created and deployed to a high standard. Here are five ways analytics CoEs add business value that organizations often overlook:

#1) Starting your journey with a clear roadmap

The adoption and use of analytics represents a significant transformation for most organizations. These models and the insights they generate can radically impact how a business operates, from day-to-day processes right up to influencing its strategic direction. Transformation of that kind requires careful planning.

An analytics CoE team can carefully plan out an organization’s analytics transformation roadmap. That means acquiring and building the right capabilities and skills, and developing models and processes that will deliver long-term value for everyone.

#2) Enabling continuous improvement and delivery

When individual teams manage their own analytics processes, nobody else has visibility into those projects, so other teams can’t learn from their successes and failures.

With one team responsible for analytics delivery, every project brings new lessons learned that can be used to continuously improve the organization’s other analytics models. This supports a continuous delivery approach where the company’s analytics capabilities and insights keep getting stronger.

#3) Becoming future-ready

An analytics CoE can take a far more strategic approach to analytics projects than any individual line of business could. When building new models, they can look beyond the team’s immediate need, and ensure that what they create and deliver will meet tomorrow’s needs as well as today’s.

That could mean ensuring that a model will support unstructured data types, like social media data, even if its first iteration will be fuelled purely by structured data. Or it could mean ensuring the platform in use is scalable and can handle insight generation in real time.

#4) Supporting data governance and compliance

Beyond simply creating models and delivering insights to the organization, the analytics CoE team can also become powerful custodians of data – performing essential maintenance tasks that keep data clean and well-structured.

Having a team directly responsible for this ensures that analytics output quality remains high. But it also helps support data governance and compliance efforts. The CoE team can define, maintain and closely monitor access rights, and help ensure that data and insights are only accessible by the right people at the right time.

#5) Ensuring high adoption and engagement across the organization

To see value from analytics, your teams need to understand how to access and use them properly. The analytics CoE team plays an important role here, as both educators and advocates for analytics adoption.

Having the analytics CoE in place helps ensure high engagement with analytics, as teams across the business have a single defined place to direct their questions, and can access dedicated assistance whenever they need it.

Build your analytics CoE with The Smart Cube

When you have an analytics CoE, you have everything you need to ensure long-term analytics success in one place. You’ll immediately be able to better meet the demands of the entire business today, while simultaneously preparing for whatever new data or insight challenges tomorrow might bring.

Find out more about The Smart Cube’s Analytics Centre of Excellence solution, or contact us today.

Future-Proofing Your Manufacturing Supply Chain Now for 2021

During times of uncertainty, manufacturers must address current challenges while also planning for the unexpected. There’s no time like the present to kick off some of these initiatives.

A very famous American once said, “If you fail to plan, you are planning to fail.” Uttered by none other than Benjamin Franklin, these words have never been more applicable for companies that are both navigating the current COVID turmoil and preparing for what’s coming around the next corner.

And while it may be tempting to focus on the here-and-now during this unprecedented situation, organizations must also start preparing for recovery now or risk falling behind the curve for 2021 (and beyond).

Thinking Ahead

Disruptions, product shortages, and a new wave of shutdowns are taking a toll on global supply chains right at a time when the International Monetary Fund (IMF) has released rosier economic growth projections for 2021.

According to its World Economic Outlook, the global forecast calls for a 5.2% increase in 2021 (following a 4.4% decline in 2020). “The improved forecast reflects both better-than-expected second quarters,” the IMF states, “mostly for advanced economies – and indicators of strong recovery in the third quarter.”

Despite the current obstacles, this is no time to bury your head in the sand and hope that the problems go away. In fact, Tiffany Stovall of Kansas Manufacturing Solutions says manufacturers should be evaluating their current state and preparing to take paths based on their circumstances. “If the COVID-19 pandemic has taught manufacturers anything,” she writes in Industry Week, “it’s that they have to be nimble and that there will be additional opportunities.”

Pointing to a recent state-of-manufacturing survey that found 89% of manufacturers were negatively impacted by the pandemic (e.g., revenue decreases, price increases, supply chain issues), Stovall notes that 97% of manufacturers found new opportunities during the pandemic.

“To take advantage of opportunities during times of uncertainty, manufacturers need to have a plan for the unexpected,” she writes. “Manufacturers would be wise to level set from reactive to proactive and develop a strategic roadmap of ‘adoptability.’”

Planning Ahead

In A five-step plan for business stability after the pandemic, Dayton Business Journal urges companies to start identifying new opportunities, setting new goals, and making projections now, even though no one really knows yet what the “new normal” is going to look like after the COVID threat passes.

“People have to start asking themselves, ‘What’s on the other side?’” one finance leader told the publication. “Is it just surviving? If we continue to invest in this business, what must we change to be successful?” Good first steps in this direction include taking inventory of the last 12 months, shoring up the balance sheet, and planning for potential supply chain disruptions.

“Businesses seeking to build resiliency also need to consider their customer and supplier concentrations and what could happen in the event of a disruption,” the Business Journal adds. “Identify demand-side risks like a reliance on a single customer or product line and plan accordingly.”

Digital Enablement Tools to the Rescue

Citing a new Alexander Group manufacturing survey, Industry Week says that while recent economic challenges have forced companies to reimagine their business models, the predicted “snap-back” in demand could leave some of them in a lurch in 2021.

“While most firms have recovered from the initial economic shock, they are learning to operate in an uncertain environment,” Alexander Group’s John Drosos writes. In response, many of these companies have invested in digital enablement tools to enhance interactions with existing and new customers.

Using supply chain applications like warehouse management systems (WMS) and supply chain visibility platforms, for instance, companies can gain good visibility across their global networks while also managing the higher velocity of e-commerce orders that are now moving through their warehouses and distribution centers.

“Never in recent history have manufacturers transformed their organizations so quickly, and so decisively,” Drosos concludes. “The COVID-19 pandemic accelerated many trends that were already appearing within commercial models. While still living with great uncertainty, manufacturers are revising their strategic business models to reflect a new reality: the buyer journey has changed, and they are responding accordingly through broad-based commercial initiatives and investments.”

Direct-to-Consumer (DTC) Strategy on the Rise

2020 has seen several consumer trends emerge. Customers have gone to their favorite retailer asking about a product they had always bought in the store, only to be told that the manufacturer no longer distributes the product in stores. In fact, the only way to make that purchase now is to order directly from the manufacturer, usually through its own e-commerce platform.

Some of these manufacturers choose to go direct to the consumer for a portion of their sales, and some are bypassing the retailer completely. In either case, this emerging trend is a game-changer for both manufacturers and retailers. The manufacturer now has to distribute its goods efficiently, much as an e-commerce retailer would. Over the last decade, we have seen retailers struggle to support omnichannel distribution, shipping goods to stores and directly to consumers from their distribution centers and stores. Today, manufacturers choosing the DTC strategy will require a vastly different distribution model.

Resilience through the pandemic

Regardless of the company or industry in question, following Franklin’s early planning advice is sure to pay off as 2021 approaches and as the business environment normalizes. “Manufacturers continue to show resilience through the pandemic. Many see reasons for optimism as they have found ways to keep their staff employed,” Stovall concludes. “Now is the time to look at how they can grow. From their current experiences, manufacturers should learn, plan, adjust, and evolve. In this environment, proactive manufacturers will have opportunities.”

Generix Group North America provides a series of solutions within our Supply Chain Hub product suite to create efficiencies across an entire supply chain. From Warehouse Management Systems (WMS) and Transportation Management Systems (TMS) to Manufacturing Execution Systems (MES) and more, software platforms can deliver a wide range of benefits that ultimately flow to the warehouse operator’s bottom line. Our solutions are in use around the world and our experience is second-to-none. We invite you to contact us to learn more.

This article originally appeared on GenerixGroup.com. Republished with permission.

Made Dizzy By COVID-19 Data? Artificial Intelligence Helps Clear Things Up.

As governors begin to make decisions about reopening the economy, Americans are left to wonder whether they should follow their state government’s lead – or make their own decisions about when to return to normal.

One problem for the average person: How to decipher the multitudes of data about COVID-19 and evaluate whether the country or any particular state is – or is not – flattening the curve.

“It’s easy to find tons of data online with charts and graphs, but all those numbers can be overwhelming,” says Sharon Daniels, chief executive officer of Arria (www.arria.com), which specializes in a form of artificial intelligence known as Natural Language Generation (NLG). “You see a line on a graph, but what is it telling you?”

Daniels’ company is among those trying to simplify that complex chore for Americans, using artificial intelligence to transform that raw data into an easy-to-understand narrative. To this end, Arria is involved with two online initiatives – the COVID-19 Live Report and the COVID-19 U.S. Tracking Report – that give Americans access to NLG as they try to grasp all the information coming their way from scientists, government officials, and the media.

Each of these free dashboards allows anyone – from government leaders to journalists to citizens – to review up-to-date COVID-19 data, along with critical insights transformed into writing by Arria’s Natural Language Generation software. The software uses language analytics and computational linguistics to “think” like a writer, pulling the most important information to the top of the narrative, providing critical insights, and giving meaning to the tabulated reports and visualizations.

Just as an example, a resident of Pulaski County, KY, who checked in on April 23 would have learned that in their community the previous day “there were 2 new cases and no deaths reported. During the past 7 days, cases have increased by 7, which means the seven-day rolling average for cases is 1.”

No human wrote those sentences. They were penned automatically by the NLG software.
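The arithmetic behind a sentence like that is straightforward to sketch. The toy Python snippet below generates a similar narrative from a week of case counts using a simple template; it is only an illustration of the general template-based technique, not Arria’s NLG software, and the daily counts are hypothetical values chosen to match the example figures above.

```python
# Toy template-based narrative generation from raw case counts.
# Hypothetical daily new-case counts for the past 7 days, most recent last.
daily_new_cases = [0, 1, 0, 2, 0, 2, 2]

new_today = daily_new_cases[-1]
week_total = sum(daily_new_cases)                     # 7 new cases this week
rolling_avg = week_total / len(daily_new_cases)       # seven-day rolling average

# Turn the numbers into a readable sentence via a template.
report = (
    f"Yesterday there were {new_today} new cases. "
    f"During the past 7 days, cases have increased by {week_total}, "
    f"which means the seven-day rolling average for cases is {rolling_avg:.0f}."
)
print(report)
```

Production NLG systems go much further, ranking which facts matter most and varying the phrasing, but the core idea of mapping computed statistics into natural-language templates is the same.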

As Arria and others do their part to help Americans work their way through the sea of information, there is evidence that such assistance is both needed and wanted:

- A Gallup poll shows widespread confusion about the state of the virus in the U.S., with Americans reaching no consensus on how things now stand: 41 percent say the situation is getting better, 39 percent say it is getting worse, and 20 percent say it is staying the same.

- A 2017 study of the U.S. public’s understanding of the 2014 Ebola outbreak in West Africa found that most people are good at assessing risk when information is communicated accurately and effectively. That study also found that Americans want accurate and honest information, even if that information might worry them.

- Knowing the facts is one way people can reduce their stress level during the pandemic, according to the Centers for Disease Control. “When you share accurate information about COVID-19,” the CDC reports, “you can help make people feel less stressed.”

“The sheer flood of data and information we are seeing daily about the pandemic is nearly impossible to process without the help of technology,” Daniels says. “People want information that will help them understand what’s happening, particularly in the areas where they live. But if that information is too confusing and complicated, they are going to remain confused and scared – wondering what to do, how to help, or how to keep their families safe.”

__________________________________________________________________

Sharon Daniels is the chief executive officer of Arria (www.arria.com), which specializes in a form of artificial intelligence known as Natural Language Generation. Daniels’ entrepreneurial career in building and expanding technology startups began in 1984 and has now spanned more than three decades of technology evolution. Her previous experience includes serving as founding executive director of Diligent Corp., a technology company that grew to become a member of the S&P/NZX 50 composite index before being acquired by Insight Venture Partners for $624 million.

Healthcare Analytics Market Size to Cross $18,250.8 million by 2025

The emergence of Big Data in healthcare is fueling healthcare analytics market growth, which is set to exceed USD 18,250.8 million by 2025.

There has been a significant paradigm shift in recent years pertaining to the collection, storage, maintenance, management and analysis of data. Over the years, bolstered by the wave of digitalization sweeping across the globe, data collection is transitioning gradually from paper-based charts to digital real-time analytics systems. 

This new data ecosystem is designed not just to enhance disease prevention rates but also to improve medical diagnostics, administer medications securely and augment overall treatment processes. Consequently, healthcare offerings are evolving from a one-size-fits-all approach to more patient-centric, customized treatment plans. 

The rising prevalence of big data, driven by the digitization of analytics, is promoting the adoption of electronic health records (EHRs) for collecting patients’ health data, which is likely to add great impetus to global healthcare analytics market expansion.

The evolution of the healthcare landscape is generating immense demand for advanced healthcare data analytics. 

Healthcare analytics, or clinical data analytics, entails the use of EHRs to garner actionable insights into patients’ health conditions and develop suitable treatment plans.

As the number of patients seeking healthcare solutions continues to surge and resources continue to deplete, conventional claims-based analytics systems are unable to keep up with rapidly arising healthcare issues. This, combined with the immense volume of clinical data in EHRs, is a major driving force behind the popularity of advanced healthcare analytics.

Considering the prolific expansion of the healthcare industry, numerous industry players are making persistent efforts towards enhancing health data analytics systems to streamline patient care.

For instance, InterSystems has recently partnered with digital engineering expert Virtusa in a bid to advance healthcare data analytics capabilities in vLife, Virtusa’s cloud-based life sciences platform.

The system comprises a robust, HIPAA-compliant data repository with multiple sources of data. Additionally, the platform features pre-built APIs as well as AI and machine learning-based models.

Prediction to Prevention – The Significance of Predictive Analytics in Healthcare Applications

Predictive analytics is a sophisticated healthcare data analytics tool that leverages historical data and real-time information to forecast potential outcomes. In the healthcare landscape, predictive analytics can be applied to consumer, claims or patient data, through which healthcare workers can predict patterns or trends that help enhance patient care or outreach programs.

To illustrate, according to a 2017 study, the University of Pennsylvania utilized a predictive analytics tool integrating machine learning and EHR information to detect severe sepsis or septic shock in patients nearly 12 hours before the condition manifested.
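To give a sense of the general shape of such a predictive tool, here is a toy risk-scoring sketch in Python. The vital-sign features, cutoffs, and alert threshold below are entirely hypothetical; a real system like the one in the Penn study learns its model from historical EHR data rather than hard-coding rules.

```python
# Toy predictive-scoring sketch: combine a few vital-sign features into a
# single risk score, then flag patients who cross an alert threshold.
def sepsis_risk_score(heart_rate, temp_c, resp_rate):
    score = 0.0
    if heart_rate > 90:                 # tachycardia
        score += 1.0
    if temp_c > 38 or temp_c < 36:      # abnormal body temperature
        score += 1.0
    if resp_rate > 20:                  # rapid breathing
        score += 1.0
    return score

# Hypothetical patients: (heart rate, temperature in C, respiratory rate).
patients = {
    "patient_a": (110, 38.9, 24),
    "patient_b": (72, 36.8, 14),
}

# Alert on any patient whose score meets the (hypothetical) threshold of 2.
alerts = [pid for pid, vitals in patients.items()
          if sepsis_risk_score(*vitals) >= 2]
print(alerts)  # -> ['patient_a']
```

A learned model would replace the hand-written cutoffs with weights fitted to outcomes, but the deployment pattern, scoring each patient’s latest data and alerting above a threshold, is the same.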

Burgeoning Need for Real-Time Healthcare Solutions

Empowered by the digital revolution, a large number of healthcare management systems are now leveraging real-time, event-driven data feeds. 

Since healthcare is a real-time activity, the ability to accumulate health-related data in real-time is a great boon for healthcare workers and clinicians. It gives them the ability to make point-of-care lifesaving decisions and reduce the dependency on resources, thereby cutting back treatment costs to a significant extent.

Real-time analytics demonstrates great potential in various healthcare scenarios. For instance, if a patient’s blood pressure shows an alarming increase, the analytics system can send a real-time report to the doctor, who can immediately administer suitable measures to counteract the condition.
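The event-driven pattern behind this kind of alert can be sketched in a few lines of Python: each reading is checked the moment it arrives, and a handler fires as soon as a threshold is crossed. The 180 mmHg cutoff, the patient IDs, and the readings are hypothetical; a real system would consume a live stream from monitoring devices rather than a hard-coded list.

```python
# Minimal sketch of a real-time, event-driven blood pressure alert.
SYSTOLIC_ALERT_MMHG = 180  # hypothetical alert threshold

alerts_sent = []

def notify_doctor(patient_id, systolic):
    # Stand-in for paging, messaging, or dashboard notification.
    alerts_sent.append(f"ALERT {patient_id}: systolic {systolic} mmHg")

def on_reading(patient_id, systolic):
    # Called once per reading as data streams in from a monitoring device;
    # no batch processing, so the doctor is notified immediately.
    if systolic >= SYSTOLIC_ALERT_MMHG:
        notify_doctor(patient_id, systolic)

# Simulated stream of incoming readings.
for patient_id, systolic in [("p1", 122), ("p1", 131), ("p1", 186)]:
    on_reading(patient_id, systolic)

print(alerts_sent)  # -> ['ALERT p1: systolic 186 mmHg']
```

The key design choice is that the check happens per event rather than in a scheduled batch job, which is what makes point-of-care decisions possible.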

Many prominent industry players are working to build real-time monitoring into their product innovations. For example, Apple recently revealed the Apple Watch Series 4 with an integrated EKG (electrocardiogram) feature, which helps users track their cardiovascular information in real time and alerts them to possible undiagnosed conditions.

Healthcare Analytics Transformation through Big Data

The emergence of big data has brought about a tremendous shift in the way data is collected, analyzed and used in a plethora of industries. Big data comprises large quantities of data generated through the digitization of myriad sources, which is then merged and analyzed by specialized technologies. When used in healthcare analytics, big data draws on population-level or individual-specific health data, which can potentially mitigate the risk of epidemics, treat maladies and reduce costs, among other benefits.

In light of the changing healthcare spectrum, more and more physicians are basing their decisions on ample quantities of clinical data instead of simply asserting their professional opinion.

With the healthcare industry swelling and more and more data being collected, professionals need a support system to ensure proper management and application of data. This is perhaps why the demand for big data analytics across the healthcare sector is witnessing such tremendous growth.

The U.S. healthcare analytics market is currently making immense strides in big data analytics, with EHR adoption across nearly 94% of hospitals. A major industry player in the U.S. market is Kaiser Permanente, which has developed and integrated a novel system called Health Connect. This system allows data sharing across multiple facilities and streamlines the use of EHRs.

Source: https://www.gminsights.com/industry-analysis/healthcare-analytics-market