IT on IT: Changing Perceptions of the IT Function and Why They Matter

February 21, 2019

There can be no doubt that IT is on the leading edge of digital transformation, driving the organizational change that is bringing enterprises fully into the digital era. And there's no shortage of think pieces and editorials on how this has shifted the role of IT. But what's often missing from the discussion is the perspective of the IT professionals on the front lines, the ones doing the work day in and day out, often putting in significant time after hours to keep the business on track.

Spotlight

Morphisec

Morphisec fundamentally changes the cybersecurity scene by shifting the advantage to defenders, keeping them ahead of attacks with moving target defense. Emerging from Israel's national cyber security center and some of the country's sharpest cybersecurity minds, Morphisec provides the ultimate threat prevention by making sure attackers never find the targets they seek.

OTHER ARTICLES
SOFTWARE

AI's Impact on Improving Customer Experience

Article | July 14, 2022

To enhance the customer experience, businesses all over the world are experimenting with artificial intelligence (AI), machine learning, and advanced analytics. AI is becoming increasingly popular among marketers and salespeople, and it has become a vital tool for businesses that want to offer their customers a hyper-personalized, outstanding experience. Customer relationship management (CRM) and customer data platform (CDP) software upgraded with AI has put the technology within reach of businesses without the exorbitant costs previously associated with it.

When AI and machine learning are used together to collect and analyze social, historical, and behavioral data, brands can develop a much more thorough understanding of their customers. Unlike traditional data analytics tools, AI continuously learns from the data it analyzes, so it can also predict customer behavior. As a result, businesses can deliver highly relevant content, boost sales, and enhance the customer experience.

Predictive Behavior Analysis and Real-time Decision Making

Real-time decisioning is the capacity to act quickly on the most up-to-date information available, such as data from a customer's most recent encounter with a company. For instance, Precognitive's Decision-AI uses a combination of AI and machine learning to assess any event in real time, with a response time of less than 200 milliseconds. Decision-AI is part of Precognitive's fraud prevention product and can be implemented on a website through an API.

Real-time decisioning also makes marketing to customers more effective. Brands can display highly tailored, relevant content and offers by using AI and real-time decisioning to discover and understand a customer's intent from the data they produce in real time. By providing deeper insight into what has already happened and what can be done to facilitate a sale, through suggestions for related products and accessories, AI and predictive analytics go further than historical data alone. This makes the customer experience more relevant, increases the likelihood of a sale, and strengthens the customer's emotional connection with the brand.
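To make the real-time decisioning pattern concrete, here is a minimal sketch: a pre-trained propensity model scores each incoming customer event, and a decision is returned within the request. The feature names, data, and threshold are invented for illustration; a product like Precognitive's Decision-AI exposes this behind an API rather than a local model.

```python
# Minimal sketch of real-time decisioning: score an incoming customer event
# with a pre-trained model and act on it immediately. Features, data, and the
# 0.5 threshold are illustrative assumptions, not any vendor's actual logic.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

# Toy history: [pages_viewed, minutes_on_site, cart_value] -> purchased (1/0)
X = np.array([[12, 9, 80], [2, 1, 0], [8, 6, 45],
              [1, 2, 0], [15, 12, 120], [3, 2, 10]])
y = np.array([1, 0, 1, 0, 1, 0])
model = GradientBoostingClassifier().fit(X, y)

def decide(event_features):
    """Called per event; must return well under the ~200 ms budget."""
    propensity = model.predict_proba([event_features])[0, 1]
    return "show_personalized_offer" if propensity > 0.5 else "no_action"

print(decide([10, 7, 60]))
```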

SOFTWARE

The Evolution of Quantum Computing and What Its Future Holds

Article | August 2, 2022

Quantum computers work in an entirely different way from anything humans have ever created or constructed. Like classical computers, they are designed to address problems in the real world, but they process data in a fundamentally different way, which makes them far more effective machines for certain tasks than any computer in use today. What makes quantum computers unique can be explained by two fundamental ideas in quantum mechanics: superposition and entanglement. The goal of quantum computing research is to find a way to accelerate the execution of long chains of computer instructions by exploiting quantum effects that are frequently observed but seem counterintuitive when written out. When this fundamental objective is accomplished, and theorists are confident it works in practice, computing will undoubtedly undergo a revolution.

Quantum computing promises to address specific problems that current classical computers cannot resolve in a timely manner. It is not a cure-all for every computing problem, but it is well suited to many "needle in a haystack" search and optimization problems.

Quantum Computing and Its Deployment

Today, only the big hyperscalers and a few hardware vendors offer quantum computer emulators and limited-size quantum computers as a cloud service. Quantum computers are used for compute-intensive problems that are not latency-sensitive, and current architectures cannot yet handle massive data sizes. In many circumstances, a hybrid quantum-classical computer is used. Quantum computers consume little power for computation itself but need cryogenic refrigerators to sustain superconducting temperatures.

Networking and Quantum Software Stacks

Many quantum computing software stacks virtualize the hardware and build a virtual layer of logical qubits. These stacks provide compilers that transform high-level programming constructs into low-level assembly commands that operate on logical qubits. In addition, software stack suppliers are designing domain-specific, application-level templates for quantum computing. The software layer hides complexity without compromising the performance or portability of the quantum computing hardware.
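The two ideas mentioned above, superposition and entanglement, can be shown in a few lines through exactly the kind of high-level software stack the article describes. A minimal sketch, assuming Qiskit is installed and using local statevector simulation rather than real hardware:

```python
# Minimal sketch of superposition and entanglement using Qiskit
# (assumed installed; runs on a local statevector simulation, not hardware).
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)      # Hadamard gate: qubit 0 enters a superposition of |0> and |1>
qc.cx(0, 1)  # CNOT gate: qubit 1 becomes entangled with qubit 0

# The result is the Bell state (|00> + |11>)/sqrt(2); measuring either qubit
# instantly determines the other.
state = Statevector.from_instruction(qc)
print(state.probabilities_dict())  # expected: {'00': ~0.5, '11': ~0.5}
```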

AI TECH

Language Models: Emerging Types and Why They Matter

Article | July 20, 2022

Language model systems, often known as text understanding and generation systems, are the newest trend in business. However, not every language model is made equal. A few types are starting to take center stage, including massive general-purpose models like OpenAI's GPT-3 and models tailored for specific tasks. A third type lives at the edge: models intended to run on Internet of Things devices and workstations, typically heavily compressed in size and limited in functionality.

Large Language Models

Large language models are trained on vast volumes of text data, sometimes reaching petabytes in size. As a result, they rank among the models with the highest number of parameters, where a "parameter" is a value the model can adjust on its own as it learns. The model's parameters, learned from prior training data, fundamentally define the model's aptitude for solving a particular task, like producing text.

Fine-tuned Language Models

Compared to their large language model siblings, fine-tuned models are typically smaller. Examples include OpenAI's Codex, a version of GPT-3 specifically tailored for programming tasks. Codex is smaller than GPT-3 and more effective at creating and completing strings of computer code, although it still has billions of parameters. Fine-tuning can improve a model's performance at a task, such as generating protein sequences or responding to queries.

Edge Language Models

Edge models, which are intentionally small, occasionally take the shape of fine-tuned models; in other cases they are trained from scratch on modest data sets to fit within specific hardware limits. Either way, edge models offer several advantages that massive language models simply cannot match, notwithstanding their limitations in some areas. The main factor is cost: an edge approach that operates locally and offline incurs no cloud usage fees.

As large, fine-tuned, and edge language models evolve in response to new research, they are likely to encounter hurdles on their way to wider use. For example, fine-tuning requires less data than training a model from scratch, but it still requires a dataset.
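In the spirit of the edge models described above, here is a minimal sketch of running a compact language model entirely locally, with no cloud usage fees. It assumes the Hugging Face transformers library is installed; distilgpt2 (roughly 82M parameters) stands in for an edge-sized model.

```python
# Minimal sketch: run a small language model locally with Hugging Face
# transformers (assumed installed). distilgpt2 is a compact stand-in for
# the "edge-sized" models discussed above.
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")
result = generator(
    "Language models at the edge are useful because",
    max_new_tokens=30,
)
print(result[0]["generated_text"])
```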

SOFTWARE

Low-code and No-code: A Business' New Best Friend

Article | July 5, 2022

Businesses are integrating artificial intelligence (AI) into their workflows in greater numbers as a result of the spread of digital transformation and advances in machine learning (ML). As a result, no-code platforms and their low-code counterparts are becoming more popular. This development is a step toward computer science's long-term objective of automating manual coding. Low-code/no-code AI platforms will benefit businesses in data-driven industries like marketing, sales, and finance, where AI can assist in a variety of ways: automating invoicing, evaluating reports, making intelligent suggestions, and anticipating churn rates.

How Does an Organization Look at Low-code/No-code as the Future?

Developers and other technical roles are in high demand, particularly in AI and data science. Low-code and no-code AI technologies give organizations a chance to close that gap through citizen data scientists, who can build AI solutions for many scenarios without an AI professional designing each one. The demand for technological solutions and AI technologies is rising significantly as the technological landscape rapidly changes. AI systems, for example, require complex software involving large amounts of code, a variety of frameworks, and the Internet of Things (IoT); the array of complicated technology strains any one person's capacity to comprehend every technical detail. At the same time, software delivery must be timely, effective, and secure while maintaining high standards.

Conclusion

Low-code AI solutions offer the speed, ease of use, and adaptability of ready-made software while drastically reducing both the time to market for AI solutions and the cost of recruiting software and computer vision engineers. Organizations are free to construct whatever architecture, functionality, or pipeline best suits their project. Building such custom models from scratch, however, can be both costly and time-consuming, so low-code/no-code platforms are best applied to the specific pipeline steps they can streamline and accelerate.
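For a sense of what these platforms abstract away, here is an illustrative sketch of the kind of pipeline a low-code tool generates behind a few clicks: data preparation, model training, and evaluation. The column names and data are hypothetical, and scikit-learn is used only as a stand-in for whatever a real platform emits.

```python
# Illustrative sketch of what a low-code AI platform automates: data prep,
# model training, and evaluation for a churn predictor. Columns and data are
# hypothetical; a real platform generates the equivalent from a visual UI.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

df = pd.DataFrame({
    "monthly_spend": [20, 80, 35, 120, 15, 95, 40, 110],
    "tenure_months": [3, 24, 8, 36, 2, 30, 6, 28],
    "churned":       [1, 0, 1, 0, 1, 0, 1, 0],
})
X_train, X_test, y_train, y_test = train_test_split(
    df[["monthly_spend", "tenure_months"]], df["churned"],
    test_size=0.25, random_state=42)

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))
```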


Related News

AI TECH, SOFTWARE, FUTURE TECH

CoreLogic’s Multi-year Alliance with Google Cloud Enables New Product Launch

CoreLogic | August 22, 2022

CoreLogic®, a leading global property data and analytics-driven solutions provider, has announced an extended relationship with Google Cloud to support the launch of its new CoreLogic Discovery Platform™. Fully built on Google Cloud's secure and sustainable infrastructure, Discovery Platform provides a comprehensive property analytics environment and cloud-based data exchange for businesses across multiple sectors. Discovery Platform allows businesses — including CoreLogic's core markets of property and real estate technology, mortgage lenders, marketers and insurance firms — to discover, integrate, analyze and model property insights to make critical business decisions faster.

The multi-year relationship between CoreLogic and Google Cloud enables the development of a scalable platform built with several Google Cloud services, including Dataproc, BigQuery, Anthos and Cloud Run, to manage the data science workloads for predictive and prescriptive analytics. BigQuery is the petabyte-scale backend for the platform, enabling comprehensive property data views built from a wide array of CoreLogic and third-party data sets, while Dataproc powers the platform's advanced analytics and data science at scale.

Typically, companies with data engineers and scientists spend weeks or even months on data wrangling before reaching the insights needed to drive business growth or mitigate risk. Through the collaboration with Google Cloud, CoreLogic's Discovery Platform provides a fully secure and compliant environment with relevant data, tools, security, and governance. Using CoreLogic data models and insights allows companies to deploy secure and compliant data analytics workflows within minutes, speeding up the delivery of mission-critical insights.

"CoreLogic and Google Cloud solved a significant challenge in the lag-time required to spin up data analytics workbenches that could be preloaded with nationwide data assets, models, libraries, software and self-service training," said John Rogers, chief innovation officer of CoreLogic. "Together, we were able to look at every part of the process — from onboarding to ingestion of data, modeling and exposure of that insight to the businesses' operational platform — and cut the lag-time down by more than 50% to give clients access to the insights they need to move the needle on their business faster and easier than ever before."

"I'm excited to see our alliance with Google Cloud flourishing. We're providing a state-of-the-art analytical platform for our clients' mission-critical processes. Discovery Platform is born from a growing alliance of two major industry innovators. I see the future horizons our research and development product teams are working on, and I am excited to see what's next."

Patrick Dodd, president and CEO of CoreLogic

"We value working with companies like CoreLogic to develop innovative technology solutions and services that enhance customer experience and deliver insights faster," said Zac Maufe, global head of Financial Services Solutions at Google Cloud. "Our collaboration will support CoreLogic's clients' needs and enable the delivery of more comprehensive and efficient solutions for businesses in the real estate finance market."

About CoreLogic

CoreLogic is a leading global property information, analytics, and data-enabled solutions provider. The company's combined data from public, contributory, and proprietary sources includes over 4.5 billion records spanning more than 50 years, providing detailed coverage of property, mortgages and other encumbrances, consumer credit, tenancy, location, hazard risk and related performance information. The markets CoreLogic serves include real estate and mortgage finance, insurance, capital markets, and the public sector. CoreLogic delivers value to clients through unique data, analytics, workflow technology, advisory and managed services. Clients rely on CoreLogic to help identify and manage growth opportunities, improve performance, and mitigate risk. Headquartered in Irvine, CA, CoreLogic operates in North America, Western Europe, and Asia Pacific.
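Since BigQuery is named as the platform's petabyte-scale backend, a hedged sketch of the kind of aggregate property query such a platform runs may help. The google-cloud-bigquery client library is real; the project, dataset, and table names below are hypothetical placeholders.

```python
# Hedged sketch of an aggregate property-analytics query on BigQuery.
# The client library (google-cloud-bigquery) is real; the project, dataset,
# and table names are hypothetical placeholders, not CoreLogic's schema.
from google.cloud import bigquery

client = bigquery.Client()  # picks up default credentials and project
query = """
    SELECT county, AVG(assessed_value) AS avg_value
    FROM `example-project.property_data.parcels`  -- hypothetical table
    GROUP BY county
    ORDER BY avg_value DESC
    LIMIT 10
"""
for row in client.query(query).result():
    print(row.county, row.avg_value)
```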


GENERAL AI

SK hynix Develops PIM, Next-Generation AI Accelerator

SK hynix | February 17, 2022

SK hynix ("the Company") announced on February 16 that it has developed PIM (Processing-In-Memory), a next-generation memory chip with computing capabilities. It has generally been accepted that memory chips store data while the CPU or GPU, like the human brain, processes it. SK hynix, challenging that notion in its pursuit of innovation in next-generation smart memory, has found a breakthrough solution with the development of this technology.

SK hynix plans to showcase its PIM development at the world's most prestigious semiconductor conference, 2022 ISSCC, in San Francisco at the end of this month. The company expects continued innovation in this technology to bring memory-centric computing, in which semiconductor memory plays a central role, a step closer to reality in devices such as smartphones.

For the first product adopting the PIM technology, SK hynix has developed a sample of GDDR6-AiM (Accelerator in Memory). GDDR6-AiM adds computational functions to GDDR6 memory chips, which process data at 16 Gbps. Combining GDDR6-AiM with a CPU or GPU instead of a typical DRAM makes certain computations 16 times faster. GDDR6-AiM is widely expected to be adopted for machine learning, high-performance computing, and big data computation and storage.

GDDR6-AiM runs on 1.25V, lower than the existing product's operating voltage of 1.35V. In addition, PIM reduces data movement to the CPU and GPU, cutting power consumption by 80%. This helps SK hynix meet its commitment to ESG management by reducing the carbon emissions of devices that adopt the product.

SK hynix also plans to introduce a technology that combines GDDR6-AiM with AI chips in collaboration with SAPEON Inc., an AI chip company that recently spun off from SK Telecom.

"The use of artificial neural network data has increased rapidly recently, requiring computing technology optimized for computational characteristics. We aim to maximize efficiency in data calculation, costs, and energy use by combining technologies from the two companies."

Ryu Soo-jung, CEO of SAPEON Inc.

Ahn Hyun, Head of Solution Development, who spearheaded the development of the technology and product, said that "SK hynix will build a new memory solution ecosystem using GDDR6-AiM, which has its own computing function." He added that "the company will continue to evolve its business model and the direction for technology development."

About SK hynix Inc.

SK hynix Inc., headquartered in Korea, is the world's top-tier semiconductor supplier, offering Dynamic Random Access Memory chips ("DRAM"), flash memory chips ("NAND flash") and CMOS Image Sensors ("CIS") for a wide range of distinguished customers globally. The Company's shares are traded on the Korea Exchange, and its Global Depository Shares are listed on the Luxembourg Stock Exchange.
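The intuition behind processing-in-memory is that data movement, not arithmetic, often dominates runtime and power. Here is a toy back-of-envelope model of that effect; every number in it is an illustrative assumption, not one of SK hynix's measured figures.

```python
# Toy back-of-envelope model of why moving computation into memory helps:
# the cost of shipping operands across the memory bus can dwarf the
# arithmetic itself. All numbers below are illustrative assumptions.
data_bytes = 1e9        # operands to process: 1 GB (assumption)
bus_bandwidth = 2e9     # effective memory-to-processor bandwidth, B/s (assumption)
compute_seconds = 0.05  # pure arithmetic time either way (assumption)

classical = data_bytes / bus_bandwidth + compute_seconds  # move data, then compute
pim = compute_seconds                                     # compute where the data lives

print(f"classical: {classical:.2f}s  PIM: {pim:.2f}s  "
      f"speedup: {classical / pim:.1f}x")
```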


Tableau Expands Data Capabilities With AI-Driven Insights

Demand Gen Report | September 26, 2019

Tableau Software has released new data capabilities with the addition of Explain Data, which aims to enhance statistical analysis and help users gain AI-driven insights from their data. The company said the new capability uses statistical methods to determine potential explanations for what could be influencing a data point, while providing further understanding of that explanation through interactive visualizations.

"With Explain Data, we're bringing the power of AI-driven analysis to everyone and making sophisticated statistical analysis more accessible so that, regardless of expertise, anyone can quickly and confidently uncover the 'Why?' behind their data," said Francois Ajenstat, Chief Product Officer at Tableau. "Explain Data will empower people to focus on the insights that matter and accelerate the time to action and business impact."
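To illustrate the general idea of "explaining" a data point, here is a conceptual sketch: check which dimension's segment average deviates most from the overall average. This deliberate simplification, with made-up data, is not Tableau's actual Explain Data algorithm.

```python
# Conceptual sketch of "explaining" a measure: find the dimension whose
# segment average deviates most from the overall average. Illustrative
# only; this is not Tableau's actual Explain Data algorithm.
import pandas as pd

sales = pd.DataFrame({
    "region":  ["East", "East", "West", "West", "West"],
    "product": ["A", "B", "A", "B", "B"],
    "revenue": [100, 110, 105, 400, 390],
})

overall = sales["revenue"].mean()
for dim in ["region", "product"]:
    deviation = (sales.groupby(dim)["revenue"].mean() - overall).abs()
    print(f"{dim}: most explanatory segment = {deviation.idxmax()} "
          f"(deviates by {deviation.max():.0f})")
```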

