BigData – Digital IT News (https://digitalitnews.com): IT news, trends and viewpoints for a digital world

Live GPFS Support Includes Cirata Data Migrator 2.5 for IBM
https://digitalitnews.com/live-gpfs-support-includes-cirata-data-migrator-2-5-for-ibm/ (Wed, 12 Jun 2024)

The post Live GPFS Support Includes Cirata Data Migrator 2.5 for IBM appeared first on Digital IT News.

Cirata announced support for IBM General Parallel File System (GPFS), a cluster file system used for IBM Spectrum Scale data lakes. The newly released Live GPFS support, included in Cirata Data Migrator 2.5, significantly reduces latency between storage changes and replication or migration results, while also improving the performance and scalability of GPFS-resident data assets migrating to the cloud.

Cirata Data Migrator lowers data latency while enhancing data migration performance for better data integration outcomes. The new Cirata Live GPFS capability initiates data transfer from a source GPFS file system as changes occur, without disruption to the storage environment. Ideally suited for cloud migrations, disaster recovery processes and continuous data migration use cases, Cirata Data Migrator with Live GPFS not only improves migration scale and performance but also supports fine-grained control and audit logging for assured compliance in increasingly multicloud data management environments.

According to Gartner®, “81% of respondents in organizations using public cloud said their organizations were using more than one cloud service provider (CSP).” “As the number of CSPs an organization uses increases, the complexity of managing them also increases. This can have negative consequences, such as performance issues associated with data latency, unplanned cost overruns or data egress fees, and difficulties with data integration.”(1)

“Modern multicloud workloads require high performance access to a common set of data to support scale-out storage and high availability. This is performed by IBM GPFS with great efficiency,” said Paul Scott-Murphy, Chief Technology Officer, Cirata. “By supporting this valued IBM GPFS capability as a Live source, Cirata Data Migrator gives organizations leveraging GPFS-resident data assets the confidence that they can flexibly migrate and replicate data with high performance and control to nearly any target, anywhere.”

Cirata Data Migrator with Live GPFS delivers the following benefits:

  • Reduces latency: By taking action immediately after a change, Cirata Data Migrator minimizes the latency between source storage modifications and the actions to transfer or modify content at the targets. This can minimize recovery point objectives (RPO) and assist in architecting solutions with a zero recovery time objective (RTO).
  • Improves scale: Cirata Data Migrator avoids the need to repeatedly scan a source file system to identify change when Live migration is in effect. This is particularly beneficial for systems with very large numbers of storage items, allowing vastly more scalable outcomes and minimizing the overhead imposed on storage.
  • Enhances performance: By avoiding the need to repeatedly scan source storage, Cirata Data Migrator with Live GPFS avoids an entire class of overhead that solutions relying on scheduled jobs incur. The result is higher performance, and reduced computational overheads.
  • Enables finer control: Cirata Data Migrator offers fine-grained control over which data assets participate in migration. The Live GPFS feature incorporates these mechanisms natively, so techniques like path mapping and pattern-based exclusion of file system content are part of the core processing performed during data transfer, exposing all of the fine-grained selectivity directly to users when desired.
  • Delivers auditable, accurate outcomes: Every action taken in response to changing source data is logged in auditable form, complementing the detailed reporting already available from migration verification to help ensure that migration outcomes are complete and accurate.
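
The scale and latency benefits above come from replacing periodic full scans with change-driven transfer. The following Python fragment is an illustrative sketch only, not Cirata's implementation: it contrasts a scheduled full-scan replicator, whose per-cycle work grows with the size of the namespace, with an event-driven consumer whose work grows only with the number of changes. All file paths and counts are invented for the example.

```python
# Illustrative sketch only (not Cirata's implementation): compare the work
# done per cycle by a scheduled full-scan replicator with an event-driven
# one. File paths and counts are invented for the example.

def scan_based_changes(source: dict, snapshot: dict) -> tuple[list, int]:
    """Diff two full snapshots; examines every item in the namespace."""
    examined, changed = 0, []
    for path, mtime in source.items():
        examined += 1
        if snapshot.get(path) != mtime:
            changed.append(path)
    return changed, examined

def event_based_changes(events: list) -> tuple[list, int]:
    """Consume change notifications; touches only what actually changed."""
    return [path for path, _ in events], len(events)

# A namespace of 100,000 files in which 3 changed since the last cycle.
source = {f"/gpfs/data/file{i}": 100.0 for i in range(100_000)}
snapshot = dict(source)
modified = ["/gpfs/data/file1", "/gpfs/data/file2", "/gpfs/data/file3"]
for path in modified:
    source[path] = 200.0
events = [(path, 200.0) for path in modified]

scan_changed, scan_work = scan_based_changes(source, snapshot)
event_changed, event_work = event_based_changes(events)
print(scan_work, event_work)  # 100000 3
```

Both approaches identify the same three changed files, but the scan examines the entire namespace every cycle; with very large numbers of storage items, that repeated overhead is exactly what a change-driven source avoids.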

Cirata Data Migrator is a fully automated solution for Hadoop data transfer and integration that moves on-premises HDFS data, Hive metadata, local filesystem data, or cloud data sources to any cloud or on-premises environment, even while those datasets are under active change. Cirata Data Migrator requires zero changes to applications or business operations and moves data of any scale without production system downtime or business disruption, and with zero risk of data loss. Migration targets supported include the Hadoop Distributed File System, Alibaba Cloud Object Storage Service, Amazon S3, Azure Data Lake Storage Gen 2, Google Cloud Storage, IBM Cloud Object Storage and Oracle Object Store.

Cirata Data Migrator 2.5, including Live GPFS support, is available now; click here for more information.

Related News:

Cirata Data Migrator 2.5 Integrates with the Databricks Unity Catalog

Cirata Data Migrator Accessible Through Google Cloud Marketplace

1 Gartner, How to Optimize for Multicloud Data Management Deployments, Masud Miraz, Adam Ronthal, 13 May 2024. GARTNER is a registered trademark and service mark of Gartner, Inc. and/or its affiliates in the U.S. and internationally and is used herein with permission. All rights reserved.

Cirata Data Migrator 2.5 Integrates with the Databricks Unity Catalog
https://digitalitnews.com/cirata-data-migrator-2-5-integrates-with-the-databricks-unity-catalog/ (Fri, 07 Jun 2024)

The post Cirata Data Migrator 2.5 Integrates with the Databricks Unity Catalog appeared first on Digital IT News.

Cirata announced the launch of Cirata Data Migrator 2.5, now featuring native integration with Databricks Unity Catalog. This enhancement deepens the Cirata and Databricks partnership, centralizing data governance and access control to streamline data operations and accelerate time-to-value for enterprises.

Databricks Unity Catalog delivers a unified governance layer for data and artificial intelligence (AI) within the Databricks Data Intelligence Platform. Using Unity Catalog, organizations can seamlessly govern their structured and unstructured data, machine learning models, notebooks, dashboards and files on any cloud or platform.

By integrating with Databricks Unity Catalog, Cirata Data Migrator enables analytics jobs to run as soon as data arrives and supports data modernization in the cloud. Supporting Databricks Unity Catalog’s functionality for stronger data operations, access control, accessibility and search, Cirata Data Migrator automates large-scale transfer of data and metadata from existing data lakes to cloud storage and database targets, even while changes are being made by the application at the source. Using Cirata Data Migrator 2.5, users can now select the Databricks agent and define the use of Unity Catalog with Databricks SQL Warehouse. This helps data science and engineering teams maximize the value of their entire data estate while benefiting from their choice of metadata technology in Databricks.

“As a long-standing partner, Cirata has helped many customers in their legacy Hadoop to Databricks migrations,” said Siva Abbaraju, Go-to-Market Leader, Migrations, Databricks. “Now, the seamless integration of Cirata Data Migrator with Unity Catalog enables enterprises to capitalize on our Data and AI capabilities to drive productivity and accelerate their business value.”

“Cirata is excited by the customer benefits that come from native integration with the Databricks Unity Catalog,” said Paul Scott-Murphy, Chief Technology Officer, Cirata. “By unlocking a critical benefit for our customers, we are furthering the adoption of data analytics, AI and ML and empowering data teams to drive more meaningful data insights and outcomes.”

This expanded Cirata-Databricks partnership builds on previous product integrations between the two companies. In 2021, the companies partnered to automate metadata and data migration to Databricks and Delta Lake on Databricks. With data available for immediate use, the integration eliminated the need to construct and maintain data pipelines to transform, filter and adjust data, along with the significant up-front planning and staging such pipelines require.

Cirata Data Migrator is a fully automated solution for Hadoop data transfer and integration that moves on-premises HDFS data, Hive metadata, local filesystem data, or cloud data sources to any cloud or on-premises environment, even while those datasets are under active change. Cirata Data Migrator requires zero changes to applications or business operations and moves data of any scale without production system downtime or business disruption, and with zero risk of data loss.

Cirata Data Migrator 2.5 is available now with native integration with the Databricks Unity Catalog. For more information, join Cirata for the upcoming webinar, “Accelerate Hadoop Migration to Databricks,” here.

Related News:

Cirata Gerrit MultiSite Simplifies Gerrit Instance Data Across Multiple Sites

Cirata Data Migrator Accessible Through Google Cloud Marketplace

Qlik AI Council Warned AI Adoption is a Risky Gamble Without Data Integrity
https://digitalitnews.com/qlik-ai-council-warned-ai-adoption-is-a-risky-gamble-without-data-integrity/ (Wed, 05 Jun 2024)

The post Qlik AI Council Warned AI Adoption is a Risky Gamble Without Data Integrity appeared first on Digital IT News.

During Qlik Connect 2024, the Qlik AI Council issued a stern caution to businesses: embracing AI without prioritizing data integrity poses significant risks. In their panel discussion, industry experts underscored that overlooking data quality could result in severe outcomes, such as operational failures, regulatory breaches, and financial setbacks. The Council’s collective message emphasizes the importance of robust data foundations that facilitate efficient, result-oriented, and less risky AI adoption. Ensuring the diversity, timeliness, accuracy, security, discoverability, and machine-friendly accessibility of data is vital for the success of AI initiatives.

The Qlik AI Council outlined two primary risks for businesses that fail to prioritize data integrity and analytics foundations in their AI adoption strategies:

  • Slow Adoption and Competitive Lag: Companies that neglect the integrity of their data and analytics foundations will be hesitant to adopt AI, causing them to fall behind their competitors. This delay in AI adoption can result in missed opportunities and a widening gap that becomes increasingly difficult to bridge.
  • Adoption Without Integrity Leads to Crises: Businesses that rush to implement AI without focusing on the caliber and quality of their data risk facing severe consequences. These can include governance issues, regulatory breaches, inefficiencies, and poor decision-making driven by biased or inaccurate data. Such missteps can lead to significant financial losses and reputational damage.

Reflecting on the current state of enterprise AI adoption, members of the Qlik AI Council commented:

“Ensuring data integrity is crucial for the responsible deployment of AI. Without accurate, diverse, and secure data, AI systems have a greater propensity to perpetuate biases and lead to significant ethical issues,” noted Dr. Rumman Chowdhury, a leading expert in ethical AI development. “Transparency, fairness, and accountability must be embedded at every stage of AI development to build trust and ensure the technology benefits all users.”

“Generative AI has the potential to revolutionize industries and drive competitiveness, but its benefits hinge on maintaining public trust,” emphasized Nina Schick, a leading authority on AI and geopolitics. “Ensuring the authenticity and reliability of AI-generated content is crucial to prevent misinformation and uphold the integrity of our digital landscape.”

“Implementing AI in a socially responsible manner is critical for aligning with global sustainability goals,” stated Kelly Forbes, a distinguished expert in AI governance. “Businesses must adopt responsible and sustainable data practices to ensure that AI contributes to long-term economic growth and societal well-being. This approach not only mitigates risks but also fosters trust and accountability.”

“Advanced AI methodologies, like graph neural networks, hold immense potential for solving complex business problems,” noted Dr. Michael Bronstein, a pioneer in this field. “High-quality and well-structured data is essential for these technologies to succeed, enabling innovative applications that range from drug discovery to interpreting non-human communication and can potentially lead to transformative outcomes.”

The Qlik AI Council was launched in January 2024 to provide continuous guidance and insight into the rapidly evolving AI landscape. Comprising distinguished experts in AI and ethics, the Council advises Qlik’s R&D and solutions teams, ensuring that AI innovations are both cutting-edge and ethically sound. By focusing on trustworthy, reliable, and minimally risky AI development, the Council aligns Qlik’s solutions with customer needs and broader societal impacts. Their expertise supports Qlik in delivering AI solutions that drive significant business outcomes while maintaining high levels of integrity.

The Council’s panel session at Qlik Connect 2024 delivered a critical message: data integrity is essential for successful AI adoption. Neglecting this can lead to severe operational, financial, and reputational issues. The Council members stressed that a focus on data and analytics foundations is vital for ethical AI development, public trust, sustainability, and innovative problem-solving. Businesses must prioritize data accuracy, diversity, security, and structure to harness AI’s full potential effectively.

To learn more about Qlik Connect 2024 or the Qlik AI Council, visit the website here.

Related News:

Qlik Adds Talend to Deliver a Leading Portfolio for Data Integration

Without Intelligent Data Infrastructure Up to 20% of AI Initiatives Fail

Kyndryl Threat Insights Managed Service Available Using Amazon Security Lake
https://digitalitnews.com/kyndryl-threat-insights-managed-service-available-using-amazon-security-lake/ (Wed, 05 Jun 2024)

The post Kyndryl Threat Insights Managed Service Available Using Amazon Security Lake appeared first on Digital IT News.

Kyndryl revealed the availability of the Kyndryl Threat Insights Managed Service through Amazon Security Lake. This service automatically consolidates security data from an organization’s Amazon Web Services (AWS) environments, offering customers increased visibility for better identification, mitigation, and response to advanced cybersecurity threats.

The announcement builds on Kyndryl’s successful collaboration with AWS, under which the companies have co-invested and co-innovated to build differentiated, scalable security and resiliency offerings. The Kyndryl Threat Insights Managed Service uses Amazon Security Lake to centralize data and apply analysis, resulting in deep insights not possible with siloed, disparate security technologies. Integrated into Kyndryl Bridge, the service gives customers a consolidated view of the security risks that could negatively impact their business.

Through the service, Kyndryl provides customers with options for:

  • Enhanced cyber resilience: Using an integrated approach coupling security and resiliency improves a customer’s ability to anticipate, protect, withstand and recover from cyber incidents. Kyndryl Threat Insights Managed Service bolsters an organization’s ability to better anticipate and protect against cyber threats and business-critical operational disruptions.
  • Improved visibility and simplicity to mitigate cybersecurity risk: Mitigating risk is a key challenge as digital estates transform and grow in complexity to meet business objectives. Through the service, customers use a single pane view into their security and resiliency risks. This helps accelerate their decision-making capabilities for faster intelligence-driven threat detection and more effective and informed response.
  • Security operations, including artificial intelligence (AI) / machine learning (ML)-empowered analysis: Anomaly detection with threat intelligence enhancement, coupled with AI/ML-based analysis, allows the service to assess security data, provide insights and prioritize investigation. This results in actionable security insights that allow customers to benefit from an integrated approach for greater detection and automated response, which simultaneously enhances customers’ compliance initiatives.
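
The AI/ML-based analysis described in the last bullet typically amounts to scoring telemetry against a learned baseline and surfacing outliers for investigation. The sketch below is a generic illustration of that idea, not Kyndryl's actual pipeline; the hostnames and event counts are invented, and a simple z-score stands in for whatever models a production service would use.

```python
# Generic illustration of ML-style anomaly scoring on security telemetry
# (not Kyndryl's pipeline): flag event counts that deviate strongly from
# a learned baseline, so analysts investigate the worst outliers first.
from statistics import mean, stdev

# Baseline: failed logins per hour during a normal week (invented data).
baseline = [12, 15, 11, 14, 13, 12, 16, 14]
mu, sigma = mean(baseline), stdev(baseline)

def anomaly_score(observed: int) -> float:
    """Standard deviations away from the baseline mean (z-score)."""
    return (observed - mu) / sigma

def prioritize(observations: dict, threshold: float = 3.0) -> list:
    """Return hosts whose activity warrants investigation, worst first."""
    flagged = {host: anomaly_score(count)
               for host, count in observations.items()
               if anomaly_score(count) > threshold}
    return sorted(flagged, key=flagged.get, reverse=True)

hosts = {"web-01": 14, "web-02": 13, "vpn-gw": 96, "db-01": 15}
print(prioritize(hosts))  # ['vpn-gw']
```

Only the clear outlier is surfaced; the hosts behaving within their baseline generate no alerts, which is the prioritization effect the service description is pointing at.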

“Security leaders understand that cyber incidents are inevitable, and they know it’s essential to have sufficient visibility to drive quick response,” said Michelle Weston, VP of Security & Resiliency, Kyndryl. “By joining Kyndryl’s deep cybersecurity expertise with the benefits of Amazon Security Lake, we can address the urgent need for an integrated approach to security and resilience, empowering our customers to not only anticipate and protect against threats but also to quickly withstand and recover.”

For more information about Kyndryl’s collaboration with AWS or Kyndryl Threat Insights Managed Service, please visit the website here.

Related News:

Kyndryl’s Unified SIM to Deliver Integrated Global Connectivity

SoftServe’s Master Key Accelerator Connects AWS and Pega Platforms

Zendata Team Emerges from Stealth with $2 Million Seed Funding
https://digitalitnews.com/zendata-team-emerges-from-stealth-with-2-million-seed-funding/ (Wed, 29 May 2024)

The post Zendata Team Emerges from Stealth with $2 Million Seed Funding appeared first on Digital IT News.

Zendata has come out of stealth mode with $2 million in funding to transform data and security management for security teams. The seed round was led by PayPal Ventures, First-hand Alliance (run by Salesforce Alumni), Geek Ventures, and Altari Ventures. The funding will primarily be used to enhance the Zendata platform and expand its customer base to meet the rising global demand from organizations navigating the complexities of modern AI and data governance.

Zendata was co-founded by industry veterans who bring decades of extensive experience working on AI and data platforms at multiple Fortune 100 companies, including PayPal. Zendata Co-Founder and CEO, Narayana Pappu, has focused his career on incubating and creating enterprise-grade data products, with experience in financial services, risk and privacy. Co-Founder Pedro Pinango has spent more than a decade leading multidisciplinary teams to build digital products for startups. Together, they launched Zendata to redefine how organizations of all sizes can effortlessly integrate data security, AI governance and privacy solutions across the entire data lifecycle.

“The founding Zendata team brings with them a strong network in Silicon Valley and beyond. They understand the convergence of the CIO/CISO and CDO roles, and the need for a platform that bridges the gap between these roles and engineering teams,” said Ihar Mahaniok, Managing Partner, Geek Ventures. “With AI governance becoming a significant tailwind, and the depreciation of third-party cookies driving increased focus on first-party data, the demand for Zendata’s solutions is expected to grow rapidly in the coming years. We are excited to partner with Narayana, Pedro, and the Zendata team to contribute to their success in addressing the growing market need to help companies manage AI and data risks effectively.”

Data breaches are becoming more common, with cybercriminals continuously developing new methods to exploit vulnerabilities in systems and networks. According to the 2023 IBM Cost of a Data Breach report, 52% of data breaches involve some form of customer Personally Identifiable Information (PII). Safeguarding sensitive data is crucial to maintaining trust among customers and stakeholders and preventing reputational damage. Having a deep understanding of the key frameworks and regulations that make this possible empowers organizations to remain compliant and stay ahead of potential threats.

Today, the data risk management market is experiencing increasing demand due to factors such as rising regulatory pressures and the growing adoption of AI and LLMs in businesses. Unfortunately, many companies and governments lack adequate context and visibility into how their data is being used, exposing them to substantial risks and liabilities. Zendata AI Governance mitigates AI adoption risks to enable organizational agility. The platform empowers organizations to gain comprehensive insights and control over their data usage, enabling them to make informed decisions and stay compliant with evolving regulations around data privacy and AI governance.

“At Zendata, we believe that AI risk is at the heart of data risk. Our no-code data security and privacy compliance platform helps businesses of all sizes navigate the complexities of data privacy and data protection regulations by integrating privacy by design across the entire data lifecycle,” said Narayana Pappu, CEO at Zendata. “Our customers have validated the strength of our platform. We employ detection, prevention and correction controls that incorporate privacy features to protect an organization’s most sensitive data and address LLM risks. With the support of our investors, we will expand our go-to-market strategy and remain committed to continuous product enhancement driven by customer needs, becoming the go-to solution for organizations seeking to navigate the complex landscape of AI governance, data privacy and security.”

Zendata is also announcing that the company has been accepted into the highly selective Topline program run by Race Capital, the investors behind Databricks. This opportunity will open up new avenues for growth and future funding for the company.

To learn more about Zendata and the Zendata team, visit the website here.

Related News: 

Browser Supply Chain Secured with c/side AI-Fueled Security Solution

JETCOOL to Receive ARPA-E Funding to Create Efficient Data Centers

HITRUST i1 Certification Earned by Causeway Solutions
https://digitalitnews.com/hitrust-i1-certification-earned-by-causeway-solutions/ (Wed, 22 May 2024)

The post HITRUST i1 Certification Earned by Causeway Solutions appeared first on Digital IT News.

Causeway Solutions is proud to announce that the Causeway Solutions Data Analytics Platform has achieved certified status for information security from HITRUST.

HITRUST Implemented, 1-year (i1) Certification demonstrates that Causeway Solutions’ Data Analytics Platform is leveraging a set of curated controls to protect against current and emerging threats. The HITRUST i1 Validated Assessment and Certification helps organizations address cybersecurity challenges and remain cyber resilient over time.

“This is the second year in a row that we’ve achieved HITRUST i1 Certification,” said William Skelly, CEO of Causeway Solutions. “I’m incredibly proud of the way our dedicated team members are adhering to strong cybersecurity practices in support of our clients and partners. We’re always reimagining and optimizing the use of AI, machine learning and our proprietary algorithms to serve clients in diverse industries, including healthcare, sports, entertainment, as well as political campaign strategists. The certification demonstrates our commitment to high standards for cybersecurity and data protection.”

“HITRUST is continually innovating to find new and creative approaches to address information security challenges,” said Jeremy Huval, Chief Innovation Officer at HITRUST. “Causeway Solutions’ HITRUST i1 Certification is evidence that they are at the forefront of industry best practices for information risk management and cybersecurity.”

“Since our HITRUST certification in 2023, we kept refining our processes and successfully improved our scores across all categories of the stringent qualifications,” added Tim Duer, Vice President of Healthcare & Enterprise Insights at Causeway Solutions. “Our team did a phenomenal job with no security breach – mitigating all phishing and spam attacks! This speaks volumes about the strength of our technology team and the infrastructure we have in place to keep our clients’ data secure.”

To learn more about the new HITRUST i1 Certification and Causeway Solutions, visit the website here.

Related News:

Future-Proofing Marketing and Data Strategies in a Post-Cookie World

Essential Checklist for Managing Accidental Big Data

EIS, Enterprise Intelligence Services, Advanced with DXC Technology
https://digitalitnews.com/eis-enterprise-intelligence-services-advanced-with-dxc-technology/ (Mon, 20 May 2024)

The post EIS, Enterprise Intelligence Services, Advanced with DXC Technology appeared first on Digital IT News.

DXC Technology is excited to reveal a partnership with Dell Technologies aimed at enhancing Enterprise Intelligence Services (EIS). This collaboration represents a major advancement in utilizing advanced technologies like AI, machine learning, data analytics, and intelligent automation to transform data into a more holistic view of the enterprise.

The EIS architecture built on the Dell Validated Design for Generative AI with NVIDIA for Model Customization focuses on harnessing advanced technologies to empower organizations to make data-driven decisions, enhance customer experiences, and optimize operational efficiency. At its core, EIS aims to build a robust AI foundation to drive future enterprise initiatives and enable proactive decision-making based on real-time insights.

Through DXC, key technologies such as data management, business intelligence (BI), AI, IoT, and cloud computing converge to extract actionable insights from vast data sets. This convergence enables real-time analytics, predictive modeling, and the anticipation of market trends, providing organizations with a competitive edge.

“The step change in technological evolution today is that machines can understand and interact in human language. Machines already could sense the world around us and thus have the ability now to be an expert at almost any task,” said Sunil Menon, Global Leader of DXC’s Data & AI business. “This means that AI will be embedded in all aspects of an enterprise, from product development, personalized customer experience to operations, regulatory and finance. The challenge is that AI needs a variety of quality, secure and ethical data to achieve this goal, and data is hard. AI needs data and data needs AI. By integrating AI as the foundation of DXC’s Enterprise Intelligence System, AI will permeate all aspects of the business, ensuring that every operational facet and customer interaction is enhanced by intelligence and foresight.”

The collaboration will enable DXC to build a robust AI foundation, serving as the primary layer for future enterprise initiatives. This foundation will facilitate data-driven decision-making, help understand market trends, enhance customer experiences, and optimize operational efficiency.

Incorporating advanced technologies, DXC aims to address the high failure rate of traditional data warehouse projects while handling the growing complexity and volume of business data more effectively, ensuring the success of data-driven initiatives.

To learn more about how DXC Technology advances Enterprise Intelligence Services, EIS, visit the website here.

Related News:

Dell AI PCs Helps Organizations Create a Modern Workplace

AI and GenAI Achieve Faster Performance with Dell and NVIDIA DGX SuperPOD

Not All Data is Created Equal: the Value of Data Granularity
https://digitalitnews.com/not-all-data-is-created-equal-the-value-of-data-granularity/ (Mon, 20 May 2024)

The post Not All Data is Created Equal: the Value of Data Granularity appeared first on Digital IT News.

Evolving into the digital age, many modern organizations have developed an understandable obsession with data. On the plus side, they’ll generously fund initiatives that support the collection and storage of data. On the negative side, there’s little realization that this often results in collecting data for data’s sake. Executives overeager to bring their companies into the 21st century by making them data-driven are part of the way there. Yes, they have realized that data is of little value without deriving insights from it. But so much data is being collected that it becomes difficult to see the wood for the trees and truly extract the sort of analysis that leads to meaningful shifts in strategy. What is the critical element they are missing? Simply this: large sets of data can lead to misleading conclusions. The real game changer is the insights that can be pulled out of nuggets and granules of data. But the catch is that you have to know where to look.

Valuing data quantity over quality equates to looking for that proverbial needle in a haystack. While it’s true that there’s a piece of valuable information somewhere in the massive dataset, how much labor and capital will it take to retrieve it? If the wrong piece of data is retrieved and developed into a plan for action, how long will it take for the organization to course correct?

Companies on the cutting edge are beginning to take note that the adage “data is the new oil” is out of date. Instead, these companies prioritize refining their algorithms, allowing them to run lean when it comes to data. For example, Facebook no longer has to worry about encroaching on users’ privacy, because its advanced algorithms can use simple, publicly available data to generate the insights necessary to sell ads.

The Level of Detail in Data

A data glut riddled with hidden costs can be avoided by honing in on the correct level of data granularity before the collection process takes place. Data granularity measures the level of detail present in data. For example, data that measures yearly transactions across all stores in a country has low granularity, while data that measures each individual store’s transactions by the second has incredibly high granularity.
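The contrast can be sketched in a few lines of Python. The store IDs, timestamps, and amounts below are invented for illustration; the point is simply that the same raw records can be rolled up at very different levels of detail:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical raw transaction records: (store_id, timestamp, amount).
transactions = [
    ("store_1", datetime(2024, 3, 5, 10, 15, 1), 19.99),
    ("store_1", datetime(2024, 3, 5, 10, 15, 2), 4.50),
    ("store_2", datetime(2024, 7, 9, 14, 2, 30), 120.00),
]

# Low granularity: one total per year across all stores.
by_year = defaultdict(float)
for store, ts, amount in transactions:
    by_year[ts.year] += amount

# High granularity: a total per store, per second.
by_store_second = defaultdict(float)
for store, ts, amount in transactions:
    by_store_second[(store, ts.replace(microsecond=0))] += amount

print(dict(by_year))        # a single coarse figure per year
print(len(by_store_second)) # many fine-grained cells from the same data
```

Both views come from identical records; which one is useful depends entirely on the question being asked of the data.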

However, there’s a danger in organizations assuming that increased data granularity directly correlates with value or applicability. When someone is lost, how far they zoom into a map isn’t proportional to their chances of finding their way home. There is an optimal level of data granularity for each function within an organization; a uniform level of granularity throughout an organization might benefit some functions but hinder others.

Consider two examples of organizations using the right and wrong levels of data granularity. Organization A succeeds by understanding the specific price sensitivity of each of its product and customer combinations. Organization B bleeds margin by pushing a blanket, top-down price increase of 5% on every product and customer combination, informed solely by the data point that costs have increased an average of 5%. Both are informed by data, but the second uses data of such low granularity that it will inevitably deliver poor results.
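A minimal sketch makes the difference concrete. The products, elasticities, and scaling rule below are all invented for illustration, not a real pricing engine:

```python
# Hypothetical products with a current price and an assumed price
# elasticity of demand (how strongly volume reacts to a price change).
products = {
    "staple":  {"price": 10.0, "elasticity": -2.5},  # price-sensitive
    "premium": {"price": 50.0, "elasticity": -0.4},  # price-insensitive
}

# Organization B: blanket +5% on everything, ignoring elasticity.
blanket = {name: round(p["price"] * 1.05, 2) for name, p in products.items()}

# Organization A: a granular rule that raises prices more where demand
# is insensitive and barely at all where customers react strongly.
def granular_increase(elasticity, max_pct=0.10):
    # Illustrative heuristic: scale the increase by 1/|elasticity|.
    return min(max_pct, 0.05 / abs(elasticity))

granular = {
    name: round(p["price"] * (1 + granular_increase(p["elasticity"])), 2)
    for name, p in products.items()
}

print(blanket)   # every product moves by the same 5%
print(granular)  # the staple barely moves; the premium absorbs more
```

The blanket rule pushes the price-sensitive staple up just as hard as the premium item, sacrificing volume where customers care most; the granular rule concentrates the increase where demand can bear it.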

Agility Matters in Uncertain Times

The pandemic, ensuing supply chain crisis, and geopolitical instability have shed light on the holes in traditional pricing models. Traditional pricing models assume that customers are highly price sensitive, that price is the deciding factor in choosing between two comparable items. However, COVID challenged these assumptions—retailers were surprised when massive discounts did little to remedy overstocks. Inflation and public health concerns governed spending patterns. Linear pricing models could not take these external factors into account. But with more granular data, pricing models can be developed which factor in location, demographics, seasonality, and countless other intangibles.

Another disadvantage of the traditional approach to pricing is that it’s inflexible and unresponsive to rapid changes in spending patterns. With today’s volatile macroeconomic conditions, agility is crucial—alternative trade routes, suppliers, and customer bases need to be established on the fly. Predictive models try to foresee global crises but are ultimately playing an unwinnable game. On the other hand, prescriptive pricing models based on granular data react so quickly that predicting the future becomes unnecessary.

Learn to Frame Complex Problems, Let AI Do the Rest

In the coming years, training and education will place more emphasis on asking the right questions rather than answering the wrong ones. While AI can automate away tedious, manual tasks, it lacks the critical thinking skills and independence necessary to frame complex problems. Data granularity goes hand in hand with this cultural shift—the collection of data will become cheap and accessible, yet granularity issues will require the skills that make humans irreplaceable.

AI isn’t automatically added value. Without talented human capital to write prompts, the lure of these tech investments can potentially do more harm than good. For example, asking AI “How do I sell more inventory?” is the wrong question: the machine would suggest applying massive discounts across the board. Rather, one should ask “How can I lift my market share, sales, and margin while preserving my value perception?” because the answer will be a balanced view of the complex outcomes that typical firms optimize for. So the approach should be to identify the most productive goal for the machine. If you forget what you actually want, the outcome can damage the business, even as the machine gets better at doing the wrong thing. The key is setting the right goals and putting rules in place that ensure nothing critical is sacrificed along the way.

For many, transitioning business processes from manual to automated and data-backed has been difficult; often the wrong KPIs are emphasized, and organizations over-fit or under-fit analytic data models. However, by focusing on the right granularity, organizations can unlock the full potential of artificial intelligence.

To learn how data granularity can help your organization, visit the website here.

Related News:

Unified Frontiers: Revolutionizing Operations Through Centralized IT/OT Integration

SQream Integrates with Dataiku for Advanced Big Data Analytics Technology

TopicLake Policy Insights Began Proving its Role in Simplifying Federal Regulations https://digitalitnews.com/topiclake-policy-insights-began-proving-its-role-in-simplifying-federal-regulations/ Tue, 14 May 2024 17:30:29 +0000 https://digitalitnews.com/?p=10794

The post TopicLake Policy Insights Began Proving its Role in Simplifying Federal Regulations appeared first on Digital IT News.

In a remarkable demonstration of dedication to improving regulatory transparency, Gadget Software Inc., a leader in AI-driven data analysis solutions, has unveiled its newest product, TopicLake Policy Insights. This innovative service has quickly gained traction, garnering over 50,000 users worldwide in just 50 days, underscoring its essential role in simplifying regulations across federal agencies.

Launched in mid-March, TopicLake Policy Insights provides daily updates on new regulations, offering users over 15 million unique insights about policies covering the past four years. By the end of the year, this service aims to expand its coverage to include 24 years of policy insights, making it an indispensable tool for professionals and individuals alike seeking to navigate the complexities of federal regulations.

Key Features of TopicLake Policy Insights include:

  • Daily Regulation Updates: Stay informed with the latest insights on new and existing regulations.
  • Comprehensive Coverage: Access insights from four years of policies, with plans to expand to 24 years by year-end.
  • User-Friendly Interface: Designed with feedback from users who prefer simpler regulation language and the ability to navigate multiple regulations and agencies in one place.
  • Mobile Accessibility: Tailored for users who prefer accessing information on mobile devices.


Additionally, Gadget Software has announced a feature update for TopicLake Policy Alerts, scheduled for Q3. This update will introduce notifications about agencies enacting or proposing new regulations, with customizable options for users to choose which agencies they want to track.

“Our goal with TopicLake Policy Insights was to demystify the complexities of federal regulations, making them accessible and manageable,” said Maxwell Riggsbee, Jr., Founder and Chief Product Officer of Gadget Software. “Reaching this user milestone so quickly not only validates our solution but also reinforces our commitment to continuously improve and expand our offerings.”

To explore how TopicLake Policy Insights can transform your understanding of federal regulations, visit the website at www.topiclake.com.

Related News:

Cirrus by Panasonic 2.0 Launched its New Platform to Manage V2X Operations

MCPTT T-Mobile Mission Critical Push-to-Talk Launched

Cirata Data Migrator Accessible Through Google Cloud Marketplace https://digitalitnews.com/cirata-data-migrator-accessible-through-google-cloud-marketplace/ Tue, 14 May 2024 14:00:48 +0000 https://digitalitnews.com/?p=10790

The post Cirata Data Migrator Accessible Through Google Cloud Marketplace appeared first on Digital IT News.

Cirata has announced that Cirata Data Migrator is now offered on the Google Cloud Marketplace. This integration of Cirata’s high-volume data migration technology with Google Cloud simplifies the process for shared clients to access the necessary technology for migrating Hadoop data lakes across multiple cloud environments, including Google Cloud, thereby improving the optimization of cloud expenses.

Cirata Data Migrator is a fully automated solution that moves on-premises HDFS data, Hive metadata, local filesystem, or cloud data sources to any cloud or on-premises environment, even while those datasets are under active change. Cirata Data Migrator requires zero changes to applications or business operations and moves data of any scale with no production system downtime, no business disruption, and zero risk of data loss. By making the solution available on Google Cloud Marketplace, customers benefit from increased migration scalability and security and achieve future readiness, agility, and efficiency in data automation and integration.

“Bringing Cirata Data Migrator to Google Cloud Marketplace will help customers quickly deploy, manage, and grow the platform on Google Cloud’s trusted, global infrastructure,” said Dai Vu, Managing Director, Marketplace & ISV GTM Programs at Google Cloud. “Cirata can now securely scale and support customers on their digital transformation journeys.”

“We are delighted to bring Cirata’s data automation and integration capabilities to Google Cloud Marketplace,” said Chris Cochran, CRO Americas and Head of Global Alliances, Cirata. “None of us can imagine a world without Google Cloud, specifically the wave that Google Cloud Marketplace is making. Cirata’s availability on Google Cloud Marketplace will support secure data automation and incorporation of AI platforms within the solution, further amplifying its power essential to hundreds of integrated applications.”

Google Cloud Marketplace offers an affordable, scalable solution to accelerate digital transformation strategies with online discovery, flexible purchasing, and fulfillment of enterprise-grade cloud solutions. For more information about Cirata Data Migrator, view it in the Google Cloud Marketplace here.

Related News:

Unveiling the Cloud Data Conundrum: Addressing Imbalanced Costs for Competitive Advantage

The Menlo Secure Enterprise Browser Enhances Security with Google Cloud
