Article | August 13, 2020
The coronavirus outbreak that began in China has grown into a pandemic affecting global health as well as social and economic dynamics. Succeeding in the face of such unimaginable shifts in market, health and social paradigms demands an ever-increasing velocity and scale of analysis, in terms of both processing and access. The COVID-19 pandemic is accompanied by an infodemic: with the novel coronavirus filling headlines, TV news and social media, it can seem as if we are drowning in information and data about the virus. With so much data being pushed at us and shared, it can be hard for the general public to know what is correct, what is useful and, unfortunately, what is dangerous. In general, levels of trust in scientists are quite high, albeit with differences across countries and regions. A 2019 survey conducted across 140 countries showed that, globally, 72% of respondents trusted scientists at "high" or "medium" levels. However, the proportion expressing "high" or "medium" trust in science ranged from about 90% in Northern and Western Europe to 68% in South America and 48% in Central Africa (Rabesandratana, 2020).
In times of crisis, like the ongoing spread of COVID-19, both scientific and non-scientific data should be trusted sources for information, analysis and decision making. While global sharing of and collaboration on research data has reached unprecedented levels, challenges remain. Trust in at least some of the data is relatively low, and outstanding issues include the lack of specific standards, coordination and interoperability, as well as data quality and interpretation. To strengthen the contribution of open science to the COVID-19 response, policy makers need to ensure adequate data governance models, interoperable standards, sustainable data-sharing agreements involving the public sector, private sector and civil society, incentives for researchers, sustainable infrastructures, human and institutional capabilities, and mechanisms for access to data across borders.
COVID-19 data is cited as critical for vaccine discovery, for planning and forecasting healthcare capacity, and for setting up emergency systems, and it is expected to contribute to policy objectives such as higher transparency and accountability, more informed policy debates, better public services, greater citizen engagement, and new business development. This is precisely why "open data" access to COVID-19 information is critical for humanity to succeed. In global emergencies like the coronavirus (COVID-19) pandemic, open science policies can remove obstacles to the free flow of research data and ideas, and thus accelerate the pace of research critical to combating the disease. UNESCO, which has opened access to several datasets, is playing a leading role in this direction. Thankfully, scientists around the world working on COVID-19 are able to work together, share data and findings, and hopefully make a difference to the containment, treatment and, eventually, vaccines for COVID-19.
Science and technology are essential to humanity’s collective response to the COVID-19 pandemic. Yet the extent to which policymaking is shaped by scientific evidence and by technological possibilities varies across governments and societies, and can often be limited. At the same time, collaborations across science and technology communities have grown in response to the current crisis, holding promise for enhanced cooperation in the future as well.
A prominent example of this is the Coalition for Epidemic Preparedness Innovations (CEPI), launched in 2017 as a partnership between public, private, philanthropic and civil society organizations to accelerate the development of vaccines against epidemic diseases. Its ongoing work has cut the expected development time for a COVID-19 vaccine to 12–18 months, and its grants are providing quick funding for some promising early candidates. It is estimated that an investment of USD 2 billion will be needed, with resources being made available from a variety of sources (Yamey, et al., 2020).
The Open COVID Pledge was launched in April 2020 by an international coalition of scientists, lawyers, and technology companies, and calls on authors to make all intellectual property (IP) under their control available, free of charge, and without encumbrances to help end the COVID-19 pandemic, and reduce the impact of the disease. Some notable signatories include Intel, Facebook, Amazon, IBM, Sandia National Laboratories, Hewlett Packard, Microsoft, Uber, Open Knowledge Foundation, the Massachusetts Institute of Technology, and AT&T. The signatories will offer a specific non-exclusive royalty-free Open COVID license to use IP for the purpose of diagnosing, preventing and treating COVID-19.
Also illustrating the power of open science, online platforms are increasingly facilitating collaborative work of COVID-19 researchers around the world. A few examples include:
1. Research on treatments and vaccines is supported by Elixir, REACTing, CEPI and others.
2. WHO-funded research and data-organization initiatives.
3. The London School of Hygiene and Tropical Medicine releases a dataset about the environments that have led to significant clusters of COVID-19 cases, containing more than 250 records with date, location, whether the event was indoors or outdoors, and how many individuals became infected. (7/24/20)
4. The European Union Science Hub publishes a report on the concept of data-driven Mobility Functional Areas (MFAs). They demonstrate how mobile data calculated at a European regional scale can be useful for informing policies related to COVID-19 and future outbreaks. (7/16/20)
While clinical, epidemiological and laboratory data about COVID-19 is widely available, including genomic sequencing of the pathogen, a number of challenges remain:
1. Not all data is sufficiently findable, accessible, interoperable and reusable (FAIR); much of it is not yet FAIR data.
2. Sources of data tend to be dispersed; even though many pooling initiatives are under way, curation often has to happen "on the fly".
3. In addition, many issues arise around the interpretation of data – this can be illustrated by the widely followed epidemiological statistics. Typically, the statistics concern "confirmed cases", "deaths" and "recoveries". Each of these items seems to be treated differently in different countries, and is sometimes subject to methodological changes within the same country.
4. Specific standards for COVID-19 data therefore need to be established, and this is one of the priorities of the UK COVID-19 Strategy. A working group within the Research Data Alliance has been set up to propose such standards at an international level.
Given the achievements and challenges of open science in the current crisis, lessons from prior experience, including the global SARS and MERS outbreaks, can be drawn on to inform the design of open science initiatives addressing the COVID-19 crisis. The following actions can help to further strengthen open science in support of responses to the COVID-19 crisis:
1. Providing regulatory frameworks that would enable interoperability within the networks of large electronic health records providers, patient mediated exchanges, and peer-to-peer direct exchanges. Data standards need to ensure that data is findable, accessible, interoperable and reusable, including general data standards, as well as specific standards for the pandemic.
2. Working together by public actors, private actors, and civil society to develop and/or clarify a governance framework for the trusted reuse of privately-held research data toward the public interest. This framework should include governance principles, open data policies, trusted data reuse agreements, transparency requirements and safeguards, and accountability mechanisms, including ethical councils, that clearly define duties of care for data accessed in emergency contexts.
3. Securing adequate infrastructure (including data and software repositories, computational infrastructure, and digital collaboration platforms) that can withstand recurring emergency situations. This includes a global network of certified, trustworthy and interlinked repositories with compatible standards to guarantee the long-term preservation of FAIR COVID-19 data, as well as preparedness for any future emergencies.
4. Ensuring that adequate human capital and institutional capabilities are in place to manage, create, curate and reuse research data – both in individual institutions and in institutions that act as data aggregators, whose role is real-time curation of data from different sources.
In increasingly knowledge-based societies and economies, data are a key resource. Enhanced access to publicly funded data enables research and innovation, and has far-reaching effects on resource efficiency, productivity and competitiveness, creating benefits for society at large. Yet these benefits must also be balanced against associated risks to privacy, intellectual property, national security and the public interest.
Entities such as UNESCO are helping the open science movement to progress towards establishing norms and standards that will facilitate greater, and more timely, access to scientific research across the world. Independent scientific assessments that inform the work of many United Nations bodies are indicating areas needing urgent action, and international cooperation can help with national capacities to implement them. At the same time, actively engaging with different stakeholders in countries around the dissemination of the findings of such assessments can help in building public trust in science.
Article | January 20, 2021
When you work with data, the most prominent challenge is choosing the right way to represent it in a readable form. With proper visualization, it becomes much easier to convey the message contained in the analyzed data, and that is exactly what data visualization provides.
By definition, data visualization is the graphical representation of data and information. Using elements like maps, charts and graphs makes it easier to spot patterns, trends, outliers, and performance.
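The point is easy to demonstrate even without a charting library. The sketch below (with made-up daily case counts) renders the same dictionary first held as raw numbers, then as a crude text bar chart; the peak on Wednesday is obvious in the chart at a glance.

```python
# Minimal sketch: the same data as raw numbers vs. a text bar chart.
# The daily case counts are invented for illustration.
cases = {"Mon": 4, "Tue": 9, "Wed": 15, "Thu": 7, "Fri": 12}

def bar_chart(data, width=30):
    """Render a horizontal text bar chart, scaled to the largest value."""
    peak = max(data.values())
    lines = []
    for label, value in data.items():
        bar = "#" * round(value / peak * width)
        lines.append(f"{label:>3} | {bar} {value}")
    return "\n".join(lines)

print(bar_chart(cases))
```

A real dashboard would use one of the libraries discussed next, but the principle is the same: encode magnitude as length or position so the eye does the comparison.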
C3 is based on D3 and provides a reusable chart library for applications. C3 assigns a class to each chart element, making it easy to define custom styles that can later be extended through D3. One of the major benefits of using C3.js is that you can update a chart even after it has been rendered.
Like C3, ReCharts is built on D3 but exposes declarative components. Lightweight and rendering to SVG elements, it helps in creating stunning charts, and the library ships with some beautiful chart examples. The charts can also be customized to your needs. It is great for static charts but can lag when many animations run at once; its intuitive API keeps it powerful and responsive.
Highcharts is an SVG-based JavaScript charting library that is quite popular among large organizations. It comes with an entire ecosystem of project templates and is even compatible with old browsers. Thanks to its interactive chart editor, even non-developers can use it with ease. It is used by popular brands such as Microsoft.
Article | June 28, 2021
If you're my age, you will remember the central premise of the 1992 classic "Sneakers", starring Robert Redford and Ben Kingsley: a top-secret black box that can break the encryption of any computer system. Quantum computing is that "black box." In the next two to seven years, quantum computers could change the face of cybersecurity. Once they can factor the products of large prime numbers that underpin current cryptography (expected between 2024 and 2030), existing cyber-defense mechanisms will be rendered obsolete. We need to plan for encryption in the quantum future.
What is Quantum Computing?
Classical computers use binary arithmetic: all numbers are sequences of bits, each either a 1 or a 0. A quantum bit (qubit), however, exists not as a 0 or a 1 but as a superposition of the two (think Schrödinger's cat). Every additional qubit doubles the processing power of a quantum computer, allowing it to explore multiple computational paths simultaneously. Similarly, Grover's algorithm effectively halves the key length of symmetric cryptographic algorithms (a square-root speedup over exhaustive search), meaning their key sizes have to be doubled to keep today's safety margin.
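The "double the key size" rule is simple enough to check on the back of an envelope. Grover's search over an n-bit key space of 2^n keys needs on the order of sqrt(2^n) = 2^(n/2) queries, so an n-bit key offers only about n/2 bits of security against a quantum attacker:

```python
import math

def grover_effective_bits(key_bits: int) -> int:
    """Grover's search needs ~sqrt(2^n) queries, so an n-bit symmetric
    key offers only ~n/2 bits of security against a quantum attacker."""
    return key_bits // 2

for key_bits in (128, 192, 256):
    # Number of Grover iterations is roughly the square root of the key space.
    quantum_ops = math.isqrt(2 ** key_bits)
    print(f"AES-{key_bits}: ~{grover_effective_bits(key_bits)}-bit quantum security "
          f"(~2^{quantum_ops.bit_length() - 1} operations)")
```

Hence AES-128 drops to roughly 64-bit quantum security, while AES-256 retains a comfortable ~128 bits, which is why doubling key sizes restores today's margin.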
In October 2019, Google demonstrated quantum supremacy with Sycamore. It performed a series of operations in 200 seconds that Google claimed would take a supercomputer about 10,000 years to complete. In December 2020, physicists from the University of Science and Technology of China in Shanghai performed a Gaussian boson sampling technique with their photon-based quantum computer, named Jiuzhang. They declared that Sunway TaihuLight (the fourth fastest supercomputer in the world) would require 2.5 billion years (approx. half the age of the Earth) to finish the computations done by their quantum computer in a mere 200 seconds.
Cryptography: The gatekeepers of security
As the wise Spider-Man said: "With great power comes great responsibility. And great risk." Much of the world's encrypted data is protected using mathematical problems with millions of plausible solutions. These encryption schemes are too complicated for even supercomputers to crack within an acceptable period, yet quantum systems could solve them quickly.
Modern cryptography relies on symmetric and asymmetric standards. The significant difference is that symmetric cryptography is based on substitution and permutation (there is no underlying mathematical assumption) and uses a single key for encryption and decryption. In contrast, asymmetric key / public key cryptography uses two different keys for encryption and decryption.
Since the mid-90s, researchers have theorized that quantum computers can break current public-key cryptographic (PKC) systems. Their ability to test multiple hypotheses concurrently, via Shor's factorization or Grover's exhaustive search, at unprecedented speeds threatens to render asymmetric cryptosystems obsolete and to significantly weaken symmetric ones.
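To see what Shor's algorithm would break, consider a toy RSA setup with deliberately tiny primes (all numbers here are illustrative). The private key falls out immediately once the public modulus is factored; trial division, as below, takes time exponential in the bit-length of the modulus, which is exactly the step Shor's algorithm reduces to polynomial time:

```python
# Toy RSA with deliberately tiny primes, for illustration only.
p, q = 61, 53
n = p * q              # 3233, the public modulus
e = 17                 # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)    # private exponent (Python 3.8+ modular inverse)

msg = 42
cipher = pow(msg, e, n)

def factor(n):
    """Trial division: exponential in the bit-length of n. Shor's
    algorithm performs this factoring step in polynomial time."""
    f = 2
    while n % f:
        f += 1
    return f, n // f

# An attacker who factors n reconstructs the private key and decrypts.
fp, fq = factor(n)
recovered_d = pow(e, -1, (fp - 1) * (fq - 1))
assert pow(cipher, recovered_d, n) == msg
```

With real 2048-bit moduli the `factor` loop is hopeless for classical machines, which is the entire basis of RSA's security, and precisely the assumption a large quantum computer invalidates.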
5G is one of the most eagerly awaited technologies in the digital world, and with good reason. In the years ahead, 5G coupled with IoT, could revolutionize the integration of digital and physical worlds. What sets it apart from its predecessor?
5G speed - it is nearly 20x faster than 4G. An average-length movie takes 6 minutes to download on 4G and less than 20 seconds on 5G.
5G supports 10x more devices per sq. km. It will seamlessly handle many more devices within the same area – a boost for IoT infrastructure.
5G latency is 25x less than 4G. According to McKinsey, 5G will speed up the mainstream adoption of IoT across multiple industries: Transport, Manufacturing, Healthcare, to name a few.
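The speed figures above are easy to sanity-check. Assuming an average movie of about 1.1 GB and an effective 4G throughput of 25 Mbps (both assumed figures, not measurements), a 20x faster 5G link reproduces the "roughly 6 minutes vs. under 20 seconds" claim:

```python
# Back-of-envelope check of the "6 minutes on 4G vs. <20 s on 5G" claim.
# Assumed figures: ~1.1 GB movie, ~25 Mbps effective 4G throughput,
# and 20x that for 5G. All numbers are illustrative.
movie_bits = 1.1 * 8e9                 # ~1.1 GB expressed in bits
mbps_4g = 25
mbps_5g = mbps_4g * 20

t_4g = movie_bits / (mbps_4g * 1e6)    # seconds on 4G
t_5g = movie_bits / (mbps_5g * 1e6)    # seconds on 5G
print(f"4G: ~{t_4g / 60:.1f} min, 5G: ~{t_5g:.0f} s")
```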
5G and Quantum – the Perfect Storm
While quantum systems provide the compute, 5G provides the channel to connect more than just mobile networks (self-driving cars, personal medical tech), thus expanding the 'threat surface.' In a 5G world, secured communications are a critical component of connectivity, and post-quantum cryptography will play a key role.
Researchers globally are devising ways to embed quantum-safe cryptography into 5G networks without compromising QoS. I even came across a patent for a quantum-resistant 5G SIM card by a Swiss company that set an industry best practice in ITU-T X.1811 for quantum-safe 5G.
Cryptocurrency Wallets: A prime candidate for Quantum hacking
Imagine you forget the password of your Bitcoin wallet, which in theory holds millions of dollars. With a quantum computer, you could unlock your wallet and save yourself many worries; it is precisely this capability that worries cryptographers. If malicious players had a quantum computer, the first thing they would try to break is the Elliptic Curve digital signature algorithm: reverse-engineer your private key, forge your digital signature, and subsequently empty your wallet. Thankfully, we are still years away from that scenario, yet it is a telling tale for designing national digital currencies that are supposed to withstand the test of time. Likewise, this vital subject, including applications with legal consequences such as smart contracts enabled by blockchain technologies (which share the same technical basis and, therefore, the same vulnerabilities to quantum IT), deserves a dedicated article, hopefully soon.
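The attack on a wallet reduces to a discrete-logarithm problem: the public key is derived one-way from the private key, and recovering the private key classically means exhaustive search. The sketch below uses a tiny multiplicative group (illustrative parameters, not a real curve) so the brute-force search, which Shor's algorithm would make feasible at ECDSA scale, actually finishes:

```python
# Toy discrete-log "wallet". Real wallets use ECDSA over a ~2^256 group;
# the tiny modulus here is illustrative so brute force terminates.
p, g = 467, 2                         # small prime modulus and base
private_key = 153                     # the secret a signature protects
public_key = pow(g, private_key, p)   # published with every transaction

def crack(pub):
    """Exhaustive discrete log: exponential classically, but polynomial
    on a quantum computer via Shor's algorithm."""
    for k in range(1, p):
        if pow(g, k, p) == pub:
            return k

assert crack(public_key) == private_key
```

At 2^256 the same loop would outlast the universe on classical hardware, which is why only a cryptographically relevant quantum computer changes the threat picture.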
The real question is: when will quantum computers become a threat to public-key cryptography? As of December 2020, IBM claims to have a 65-qubit quantum computer and has already delivered a 53-qubit model to a client (it would take around 1,500 qubits to hack Bitcoin private keys). Estimates of when quantum computers could achieve the required processing power range from as soon as 2024 to as far out as 2040.
How do we solve it?
Public-key cryptography enables over 4.5 billion users to securely access over 200 million websites and engage in over $3 trillion of e-commerce transactions. Further, an estimated 20% of all IT applications rely on PKC, and an even higher percentage on symmetric cryptography. According to Prof. Davor Pavuna of the École Polytechnique Fédérale de Lausanne, "several quantum prototypes might already become functional in 2023 (specifically in China)", and that potentially poses a severe protection challenge much earlier.
Many companies are developing "post-quantum cryptography" (PQC), or "quantum-safe cryptography" (QSC): algorithms whose security is not degraded by any known quantum computing algorithm. Typical families are lattice-based, code-based (such as the McEliece cryptosystem), and hash-based cryptography. While these developments promise 'quantum resistance,' they only reflect our current knowledge of quantum computing capabilities, and the security benchmarks set for them are still relatively modest. These methods aim to create mathematical problems that are too difficult even for a quantum computer to solve. The US National Institute of Standards and Technology (NIST) plans to recommend a PQC standard by 2022-23 and has already done so specifically for hash-based signatures. Similarly, the German BSI has issued official guidance on post-quantum key-exchange mechanisms, somewhat differing from NIST's, and the IETF has independently standardized two hash-based signature schemes, LMS and XMSS, also with differences. Last but not least, the ITU-T quietly issued an amended recommendation on IPTV security, X.1197 Amd1, that provides comprehensive guidance on the state-of-the-art standard PQC options available as of late 2019 for use in multimedia transmission, with a corrigendum issued in early 2020.
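The idea behind hash-based schemes such as LMS and XMSS can be seen in their simplest ancestor, the Lamport one-time signature, sketched below: security rests only on the one-wayness of the hash function, which Grover's algorithm weakens but does not break. This is a minimal teaching sketch, not any standardized scheme, and a keypair must never sign more than one message.

```python
import hashlib
import secrets

# Lamport one-time signature sketch: the hash-based idea underlying
# LMS and XMSS. Each key pair may sign exactly ONE message.
H = lambda b: hashlib.sha256(b).digest()
BITS = 256  # one secret pair per bit of the message digest

def keygen():
    sk = [[secrets.token_bytes(32) for _ in range(2)] for _ in range(BITS)]
    pk = [[H(s) for s in pair] for pair in sk]  # publish hashes only
    return sk, pk

def sign(msg: bytes, sk):
    digest = int.from_bytes(H(msg), "big")
    # Reveal one of the two secrets per digest bit.
    return [sk[i][(digest >> i) & 1] for i in range(BITS)]

def verify(msg: bytes, sig, pk) -> bool:
    digest = int.from_bytes(H(msg), "big")
    return all(H(sig[i]) == pk[i][(digest >> i) & 1] for i in range(BITS))

sk, pk = keygen()
sig = sign(b"quantum-safe hello", sk)
assert verify(b"quantum-safe hello", sig, pk)
assert not verify(b"tampered message", sig, pk)
```

Forging a signature requires inverting SHA-256 on unrevealed positions; even Grover only quadratically speeds that search, which is why hash-based signatures were among the first PQC constructions to be standardized.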
Applying the Solution
Post-quantum cryptography is a developing field. Although these algorithms are quantum-resistant in theory, there is still unpredictability about their efficacy. Secondly, the algorithms are heavy on memory and compute requirements, making it challenging to apply them universally. On the other hand, symmetric cryptography is more efficient and shows more resilience to quantum IT, yet needs an upgrade to accommodate larger key sizes. One such system I came across, patented by the aforementioned Swiss company, is eAES®, which enhances AES's quantum resistance. It makes safely increasing the key size a reality (as per NIST's IR 8105 guidance), a claim confirmed in a report by their competitor Kudelski Security on the implementation for Intel® processors.
The transition to PQC standards requires a staged approach. To successfully navigate the impending cryptographic change, companies and governments must embrace crypto-agility - the ability to rapidly adapt and switch between multiple cryptographic standards at varying levels. We must support algorithms from different standardization bodies such as NIST, ETSI, the ITU-T, ISO/IEC, and the IEEE in a connected world with fractured standards.
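In code, crypto-agility usually means that callers name an algorithm instead of hard-coding one, so a broken primitive can be retired by a registry update rather than a rewrite. A minimal sketch of that pattern, using stdlib HMAC primitives as stand-ins (the registry layout and algorithm names are illustrative, not any standard API):

```python
import hashlib
import hmac

# Crypto-agility sketch: algorithms are looked up by name, so a broken
# primitive can be swapped or banned without touching calling code.
REGISTRY = {
    "hmac-sha256": lambda key, msg: hmac.new(key, msg, hashlib.sha256).digest(),
    "hmac-sha3-256": lambda key, msg: hmac.new(key, msg, hashlib.sha3_256).digest(),
}
DEPRECATED = {"hmac-sha1"}  # retired primitives are rejected outright

def authenticate(alg: str, key: bytes, msg: bytes) -> bytes:
    if alg in DEPRECATED or alg not in REGISTRY:
        raise ValueError(f"algorithm {alg!r} not permitted")
    return REGISTRY[alg](key, msg)

tag = authenticate("hmac-sha256", b"shared-key", b"payload")
```

Migrating to a PQC primitive then becomes a one-line registry change plus a deprecation entry, which is exactly the agility a multi-standard world demands.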
Building a global quantum security alliance
We are just laying the foundations of this new security ecosystem; however, more work is needed to drive broader adoption. While the academic, innovation labs, and specialist technical communities are making some progress, cha
Article | February 10, 2020
Certain programming skills are always in demand, even among cybercriminals. Recently, an underground Russian forum known as XSS held a competition offering $15,000 in cash prizes to cybercriminal developers who could write an article or develop a proof-of-concept video on various topics, including finding and exploiting zero-day and one-day vulnerabilities, developing crypto algorithms, and how best to conduct an advanced persistent threat attack, according to an analysis by security firm Digital Shadows.