insideBIGDATA Latest News – 10/20/2020

In this regular column, we’ll bring you all the latest industry news centered around our main topics of focus: big data, data science, machine learning, AI, and deep learning. Our industry is constantly accelerating with new products and services being announced all the time. Fortunately, we’re in close contact with vendors from this vast ecosystem, so we’re in a unique position to inform you about all that’s new and exciting. Our massive industry database is growing all the time, so stay tuned for the latest news items describing technology that may make you and your organization more competitive.

TigerGraph Demonstrates Scalability to Support Massive Data Volumes, Complex Workloads and Real-World Business Challenges

TigerGraph, the scalable graph database for the enterprise, announced the results of the first comprehensive graph data management benchmark study using nearly 5TB of raw data on a cluster of machines – and the performance numbers prove graph can scale with real data, in real time. The company used the Linked Data Benchmark Council Social Network Benchmark (LDBC SNB), recognized as the reference standard for evaluating graph technology performance with intensive analytical and transactional workloads. TigerGraph is the industry’s first vendor to report LDBC benchmark results at this scale. TigerGraph is able to run deep-link OLAP queries on a graph of almost 9 billion vertices (entities) and more than 60 billion edges (relationships), returning results in under a minute.
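
For readers unfamiliar with the term, a “deep-link” query is essentially a multi-hop traversal: start from one entity, follow relationships several hops out, and aggregate what is reachable. The toy Python sketch below illustrates the k-hop idea on a small adjacency list; it is a conceptual illustration only, not TigerGraph’s GSQL and not the LDBC SNB workload itself.

```python
from collections import deque

def k_hop_neighbors(adj, start, k):
    """Breadth-first traversal returning every vertex within k hops of `start`."""
    seen = {start}
    frontier = deque([(start, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == k:
            continue  # don't expand beyond k hops
        for nbr in adj.get(node, ()):
            if nbr not in seen:
                seen.add(nbr)
                frontier.append((nbr, depth + 1))
    seen.discard(start)
    return seen

# Toy social graph: who follows whom.
graph = {"alice": ["bob"], "bob": ["carol", "dave"], "carol": ["eve"]}
print(k_hop_neighbors(graph, "alice", 3))  # {'bob', 'carol', 'dave', 'eve'}
```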

“This benchmark and these results are significant, both for TigerGraph and the overall market. While TigerGraph has multiple customers in production with 10X data size and number of entities/relationships, this is the first public benchmark report where anyone can download the data, queries, and perform the benchmark. No other graph database vendor or relational database vendor has demonstrated equivalent analytical capabilities or performance numbers,” stated Dr. Yu Xu, CEO and founder, TigerGraph. “If there was lingering uncertainty about graph’s ability to scale to accommodate large data volumes in record time, these results should eliminate those doubts. Graph is the engine that enables us to answer high-value business questions with complex real data, in real time, at scale. TigerGraph’s ongoing work in advanced graph analytics has been validated by market recognition, innovative customer applications and continued product evolution – and these benchmark results confirm the company’s position as a clear market leader, succeeding where other vendors have failed.”

NXP Announces Expansion of its Scalable Machine Learning Portfolio and Capabilities 

NXP Semiconductors N.V. (NASDAQ: NXPI) announced that it is enhancing its machine learning development environment and product portfolio. Through an investment, NXP has established an exclusive, strategic partnership with Canada-based Au-Zone Technologies to augment NXP’s eIQ™ Machine Learning (ML) software development environment with easy-to-use ML tools and expand its offering of silicon-optimized inference engines for Edge ML.

Au-Zone’s DeepView™ ML Tool Suite will augment eIQ with an intuitive, graphical user interface (GUI) and workflow, enabling developers of all experience levels to import datasets and models, and rapidly train and deploy NN models and ML workloads across the NXP Edge processing portfolio. To meet the demanding requirements of today’s industrial and IoT applications, NXP’s eIQ-DeepViewML Tool Suite will provide developers with advanced features to prune, quantize, validate, and deploy public or proprietary NN models on NXP devices. Its on-target, graph-level profiling capability will give developers unique, run-time insights to optimize NN model architectures, system parameters, and run-time performance. By adding Au-Zone’s DeepView run-time inference engine to complement the open source inference technologies in NXP eIQ, users will be able to quickly deploy and evaluate ML workloads and performance across NXP devices with minimal effort. A key feature of this run-time inference engine is that it optimizes system memory usage and data movement uniquely for each SoC architecture.
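
As a rough illustration of the prune/quantize/deploy step described above, the sketch below uses TensorFlow Lite’s post-training (dynamic-range) quantization as a generic stand-in; eIQ and DeepView ship their own tooling, and the model path here is hypothetical.

```python
import tensorflow as tf

# Convert a hypothetical SavedModel export into a quantized TFLite artifact.
converter = tf.lite.TFLiteConverter.from_saved_model("exported_model")
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # enable dynamic-range weight quantization
tflite_model = converter.convert()

# Write the compact model that an on-device runtime would load.
with open("model_quantized.tflite", "wb") as f:
    f.write(tflite_model)
```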

“NXP’s scalable applications processors deliver an efficient product platform and a broad ecosystem for our customers to quickly deliver innovative systems,” stated Ron Martino, Senior Vice President and General Manager of Edge Processing at NXP Semiconductors. “Through these partnerships with both Arm and Au-Zone, in addition to technology developments within NXP, our goal is to continuously increase the efficiency of our processors while simultaneously increasing our customers’ productivity and reducing their time to market. NXP’s vision is to help our customers achieve lower cost of ownership, maintain high levels of security with critical data, and to stay safe with enhanced forms of human-machine-interaction.”

Neo4j Announces the First Graph Machine Learning for the Enterprise

Neo4j, a leader in graph technology, announced the latest version of Neo4j for Graph Data Science™, a breakthrough that democratizes advanced graph-based machine learning (ML) techniques by leveraging deep learning and graph convolutional neural networks.

Until now, few companies outside of Google and Facebook have had the AI foresight and resources to leverage graph embeddings. This powerful and innovative technique calculates the shape of the surrounding network for each piece of data inside a graph, enabling far better machine learning predictions. Neo4j for Graph Data Science version 1.4 democratizes these innovations to upend the way enterprises make predictions in diverse scenarios, from fraud detection to tracking customer or patient journeys, to drug discovery and knowledge graph completion.

Graph embeddings are a powerful tool to abstract the complex structures of graphs and reduce their dimensionality. This technique opens up a wide range of uses for graph-based machine learning.
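
A minimal sketch of what generating node embeddings can look like from Python, assuming a Neo4j instance with the Graph Data Science library installed. The graph name, node label, relationship type, and credentials are placeholders, and procedure names vary across GDS releases (the FastRP embedding is shown here).

```python
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

project = "CALL gds.graph.create('people', 'Person', 'KNOWS')"
embed = """
CALL gds.fastRP.stream('people', {embeddingDimension: 64})
YIELD nodeId, embedding
RETURN gds.util.asNode(nodeId).name AS name, embedding
LIMIT 5
"""

with driver.session() as session:
    session.run(project)               # project an in-memory graph
    for record in session.run(embed):  # one embedding vector per node
        print(record["name"], record["embedding"][:4])

driver.close()
```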

“We are thrilled to bring cutting-edge graph embedding techniques into easy-to-use enterprise software,” stated Dr. Alicia Frame, Lead Product Manager and Data Scientist at Neo4j. “What we’ve brought to bear with the latest version of Neo4j for Graph Data Science democratizes state-of-the-science techniques, and makes it possible for anyone to use graph machine learning. This is a game-changer for what can be achieved with predictive analysis.”

Sisense Empowers SDL with Data and Analytic Insight to Help Drive Global Business Expansion

Sisense, a leading analytics platform for developers, announced that SDL has provided all employees with instant access to easily consumable customer and project analytics using the Sisense platform. This has accelerated data-driven decision-making and the ability to manage customer projects for SDL, the intelligent language and content company. The ability to provide near real-time visibility into customer demand holistically was never possible before the company began using the Sisense platform.

“As a company of our size — and global reach — we’re inundated with data. Sometimes it can be difficult to find simple answers,” stated Marion Shaw, Director of Data and Analytics, SDL. “We’re really delighted with the capability of the Sisense platform; it’s far beyond what we’ve ever had, and its flexibility helps us give the business the answers they need. It’s also constantly evolving, helping us to stay ahead of market demands.”

Fluree Open Sources Its Entire Web3 Data Platform 

Fluree, a market leader in secure data management, is releasing its core source code under the AGPL open source license. Developers can now pull from and contribute to Fluree on GitHub, in turn building a new web ecosystem that promotes data-centric security, traceability, and global interoperability.

“By open sourcing our technology, we reject the status quo practice of locking data up in proprietary format, and instead solidify our commitment to building best-in-class open source solutions to modern data management problems,” stated Fluree Co-CEO Brian Platz. “We are offering enterprises a bridge from vendor lock-in towards a future of complete data ownership, portability, and interoperability.”

Espressive Barista Understands 1.3 Billion Employee Language Phrases Out-of-the-Box, Bridging Gap Between AI and the Semantics of Human Language

Espressive, a pioneer in artificial intelligence (AI) for enterprise service management (ESM), announced that its AI-based virtual support agent, Espressive Barista, now understands 1.3 billion phrases across 14 major enterprise service teams and 9 languages. Barista represents a paradigm shift in the delivery of AI-based employee self-help. Powered by an advanced natural language processing (NLP) engine and sophisticated machine learning capabilities, Barista bridges the gap between AI and the semantics of human language. Espressive also announced that Barista is the first virtual support agent to integrate with Interactive Voice Response (IVR) systems for enterprise service management. The new integration reduces help desk calls by offering direct access to Barista for employees on hold, providing answers in three seconds. With Barista, enterprises can benefit from the highest help desk call deflection, while increasing employee adoption and workforce productivity.

“Many people question whether AI really understands what it reads. After all, it doesn’t have the common sense to understand human language,” stated Pat Calhoun, founder and CEO of Espressive. “We believe you can solve this issue by bridging the gap between AI and the semantics of human language with enough data, sophisticated technology, and talent. We recognized that the success of Amazon Alexa and Google Home is predicated on a high degree of accuracy due to millions of consumers who use them daily in addition to an army of data scientists, computational linguists, developers, and machine learning engineers tuning the AI engine behind the scenes. So, we designed Barista to replicate that model. I’m proud to announce that today, Barista understands over 1.3 billion enterprise phrases with a high degree of accuracy, and that number grows daily. That’s why our customers experience the highest ticket deflection in the industry.”

RudderStack Launches RudderStack Cloud – Customer Data Platform Built for Developers Offers Key Integrations Including Snowflake and DBT

RudderStack, which provides a Customer Data Platform (CDP) designed specifically for developers, launched the next generation of its SaaS offering – RudderStack Cloud, the most efficient, affordable and sophisticated customer data product for developers. Offering integrations with platforms such as Snowflake and DBT, RudderStack Cloud solves the data silo problem by enabling data engineers to unify their data and add CDP functionality on top of their own warehouse.

Traditional CDPs have tried to solve for data collection and activation, but unfortunately most of them make the problem worse by creating additional data silos and integration gaps. Data engineers often find themselves stuck in the middle, only partially leveraging the power of tools like Snowflake and DBT because other parts of the stack don’t integrate with their larger data workflow. RudderStack puts developers, their preferred tools and modern architectures front and center, helping data engineers and their companies uncover powerful new opportunities in the way they connect these critical systems and put them to work across the organization.
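
To make the developer-first angle concrete, here is a minimal event-tracking sketch using RudderStack’s Segment-compatible Python SDK; the module name, write key, data-plane URL, and event fields are placeholders based on the SDK documentation of that period and should be treated as assumptions.

```python
import rudder_analytics

# Placeholders supplied by your RudderStack workspace/deployment.
rudder_analytics.write_key = "YOUR_WRITE_KEY"
rudder_analytics.data_plane_url = "https://your-data-plane.example.com"

# Each tracked event can land in your own warehouse (e.g. Snowflake), ready for DBT models.
rudder_analytics.track(
    user_id="user-123",
    event="Order Completed",
    properties={"order_id": "A-1001", "revenue": 49.99},
)
rudder_analytics.flush()  # force-send queued events before the script exits
```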

“It’s time for a new approach in the way companies architect their customer data stacks and how CDPs fit into the toolset, and that’s exactly what we’re building at RudderStack,” stated Soumyadeb Mitra, CEO of RudderStack. “By enabling developers to turn their warehouse into a CDP, we’re removing data silos, solving security concerns, and making the richest possible data more widely available across the entire organization.”

Treasure Data Provides Game-Changing Analytics for Brands with Launch of Treasure Insights

Treasure Data™ launched new product capabilities for its Customer Data Platform (CDP) that provide game-changing analytics to brands. Treasure Data announced 15 new integrations, bringing the total number of connectors in its network to more than 170. Finally, with this launch Treasure Data also introduced an in-store SDK (software development kit) that provides retailers a complete, unified view of the shoppers’ journey.

“Treasure Data empowers businesses to build insights at the speed of customer decisions,” stated Rob Parrish, Vice President of Product, Treasure Data. “Backed by our industry-leading customer data management capabilities, Treasure Data continues to build on its comprehensive solution to further accelerate time to value for our customers.” 

Informatica Announces Advanced Capabilities in Enterprise Cloud Data Management to Help Businesses Swiftly Transform in the Cloud

Informatica, the enterprise cloud data management leader, announced new advanced capabilities designed to help customers rapidly become cloud-first, cloud-native in this global pandemic. IDC predicts continued double-digit growth in infrastructure digital transformation in 2020 during the pandemic as companies increasingly invest in the cloud to accelerate their digital transformation efforts. Informatica has been at the forefront of enterprise cloud data management, constantly innovating to help its customers succeed in the Cloud-AI era.

“Customer-focused innovation with a pulse on the industry is what drives Informatica’s market leadership in enterprise cloud data management,” stated Jitesh Ghai, SVP and GM, Data Management, Informatica. “We have made significant enterprise scale, cloud-native and AI-powered investments in product and platform innovation as shown in our Gartner Magic Quadrant leadership in all five key categories of data management. As businesses transform themselves using cloud analytics to stay competitive amidst a global pandemic, Informatica is well-positioned to help them succeed in the Cloud-native and AI era.”

GoSpotCheck Builds on Google Cloud for Real-Time Activity Tracking for Some of the World’s Biggest Brands

GoSpotCheck, the software company reimagining how tomorrow’s workforce works, announced that it has integrated Looker, the business intelligence (BI) and analytics platform from Google Cloud, to create a platform for building customized data experiences that accelerate business outcomes for its customers.

GoSpotCheck (GSC) is a mobile task management platform that connects frontline workers with corporate goals and directives, creates a shared view of the field, and helps leaders make better decisions, faster. By deploying Looker, GSC was able to create 225 customized data experiences that seamlessly fit into existing workflows to deliver real-time data at the point of need, and reduce the overall time needed to build reports by 70%. Today, GSC delivers insights 95% faster to hundreds of its top enterprise customers worldwide, including Dole, Fruit of the Loom, Save A Lot, and Under Armour.

“A lot of our customers operate in complex ecosystems where they have a lot going on, and with Looker we ensure that data isn’t one of the things they need to worry about. We provide the right amount of data to different levels of users in the enterprise and visualize it in the ways  they want to consume it based on their role or business objectives. Being able to get the right reporting to these different layers of stakeholders provides incredible value to our customers and a serious competitive advantage,” stated Jeff Wrona, VP of Strategic Implementations at GSC.

Kespry Collaborates with Microsoft to Deliver Kespry Perception Analytics for Intuitively Searching and Analyzing Complex Visual and Geospatial Data

Kespry, a leading visual search and analytics solution provider, announced the availability of Kespry Perception Analytics. The solution is designed for industrial use cases requiring comprehensive analysis of complex visual data, including asset condition monitoring and identifying business-impacting anomalies. Kespry Perception Analytics vertically integrates as an ISV solution for the Microsoft Dynamics 365 and Power Platform.

At the heart of Kespry Perception Analytics is a knowledge graph that accurately maps a company’s entire library of visual data, including media files and photogrammetric output, by types of physical assets, their specific geographic location, and the types and instances of issues identified. The platform provides a comprehensive toolset to ingest and index the data, as well as to leverage Microsoft’s Azure machine learning (ML) to generate insights on the data. What differentiates Kespry Perception Analytics is its intuitive search and analytics capabilities that enable reliability and maintenance teams to query data without any coding knowledge. It offers interactive dashboards and data visualization tools to analyze the health of assets across the company.

“Kespry Perception Analytics delivers unprecedented business insight and solves major problems for industrial companies that have struggled to get meaningful value from visual data in a timely manner,” stated George Mathew, CEO, Kespry. “It provides companies with a more complete view of the state of assets than just depending on telemetry data alone. It’s designed with a simple interface to help users intuitively navigate through complex analysis with ease.”

Privacera Platform 4.0 Automates Enterprise Data Governance Lifecycle

Privacera, a cloud data governance and security leader founded by the creators of Apache Ranger™, announced the general availability of version 4.0 of the Privacera Platform, an enterprise data governance and security solution for machine learning and analytic workloads in the public cloud. Driven by increasing customer demand, Privacera 4.0’s new features include: access workflows for faster on-boarding and customized data access; expanded discovery for seamless data tagging in complex infrastructures; and an encryption gateway for automated encryption and decryption capabilities.

“For enterprises to truly maximize the value of their data, they must ensure they know exactly where their sensitive data is located and who has access to it, which can be a very time-consuming and manual process for many,” stated Srikanth Venkat, VP of Product at Privacera. “Privacera 4.0 is a direct response to this need, and we’ve made significant improvements to provide our customers the most seamless experience possible. We’ve made the entire governance lifecycle completely automated for our customers, ensuring they’re protected across even the most complex of infrastructures.” 

Data Analytics Customers Value Choice and Simplicity; Teradata’s New Flexible Cloud Pricing Provides Both

Recognizing that data analytics workloads, usage patterns, and utilization rates can vary widely across an organization, Teradata (NYSE: TDC), the cloud data analytics platform company, announced flexible cloud pricing options to make it easy for enterprises to grow and benefit from data analytics in the cloud. In keeping with Teradata’s intention to provide its customers with simplicity and choice, the company now offers two flexible cloud pricing models: Blended and Consumption. Blended Pricing is best suited for high utilization and provides the ultimate in billing predictability while delivering the lowest cost at scale. Consumption Pricing is an affordable, pay-as-you-go option best suited for ad hoc queries and workloads with typical or unknown utilization, and delivers cost transparency for easy departmental chargeback. With broad availability of both models, enterprises can expect more choice, lower risk, greater efficiency, and higher transparency from Teradata. These options are critical in today’s unpredictable market, where technologies, supply chains, and customer expectations can shift abruptly, leaving companies with stranded data analytics investments if their software fails to provide enough flexibility to evolve as needs change.
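
The trade-off between the two models is essentially a break-even calculation. The toy sketch below compares a flat blended commitment against pay-as-you-go consumption billing; all rates and usage figures are invented for illustration and are not Teradata’s actual prices.

```python
# Hypothetical rates (not Teradata's actual pricing).
BLENDED_MONTHLY_FEE = 50_000        # flat commitment: predictable bill, lowest cost at scale
CONSUMPTION_RATE_PER_UNIT = 7.50    # pay-as-you-go: billed per consumed "unit"

def cheaper_model(units_per_month: float) -> str:
    """Return which pricing model costs less at a given monthly usage level."""
    consumption_cost = units_per_month * CONSUMPTION_RATE_PER_UNIT
    return "Consumption" if consumption_cost < BLENDED_MONTHLY_FEE else "Blended"

for usage in (1_000, 5_000, 10_000):  # ad hoc vs. steady production workloads
    print(usage, "units ->", cheaper_model(usage))
# Break-even here is 50,000 / 7.50, i.e. roughly 6,667 units per month.
```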

“If 2020 has taught us anything, it’s that change happens fast, and having simple, flexible cloud pricing options gives customers the freedom needed to optimize their data analytics investments,” stated Hillary Ashton, Chief Product Officer at Teradata. “Different analytic use cases have vastly different utilization patterns at different points in time, which means that having choice in pricing models enables Teradata to offer the best one for each customer scenario ranging from a small ad hoc discovery system to a large production analytics environment.”

Cyxtera Brings Innovative AI/ML Compute as a Service Offering to Federal Market

Cyxtera, a global leader in data center colocation and interconnection services, announced the availability of its landmark Artificial Intelligence/Machine Learning (AI/ML) compute as a service offering for government agencies needing innovative infrastructure to power AI workloads. The first-of-its-kind offering in the market leverages the NVIDIA DGX™ A100 system and will be delivered via Cyxtera’s Federal services exchange platform, which is certified as FedRAMP Ready at the High Impact Level, from the company’s highly secure data centers in Northern Virginia and Dallas-Fort Worth.

The availability of Cyxtera’s AI/ML compute as a service solution provides government agencies, as well as their contractors and sub-contractors, greater performance, agility and rapid deployment of infrastructure to support AI workloads. The offering also eliminates the need for significant capital outlays and lengthy provisioning cycles typically required for systems and supporting infrastructure designed to meet the needs of government AI-related initiatives.

“Bringing the power of the NVIDIA DGX-powered AI/ML compute as a service offering to the government market further enhances Cyxtera’s ability to deliver a robust set of secure, leading-edge infrastructure options to meet the evolving needs of our federal government customers,” stated Leo Taddeo, President of Cyxtera Federal Group and Chief Information Security Officer for Cyxtera. “With Cyxtera’s FedRAMP Ready status at the High Impact Level for on-demand infrastructure and interconnection solutions built for sensitive federal government data, our team is able to provide public sector customers with a leading-edge solution aligned with the federal government’s ‘Cloud Smart’ prioritization in IT modernization efforts.”

Couchbase Advances Edge Computing Innovation with 2.8 Release of Couchbase Lite and Sync Gateway

Couchbase, the creator of the enterprise-class, multicloud to edge NoSQL database, announced version 2.8 of Couchbase Lite and Couchbase Sync Gateway for mobile and edge computing applications. Available now, the release gives organizations the power to take full advantage of a distributed cloud architecture, creating always-fast, always-on applications that guarantee business uptime even in a disconnected computing environment.

“Enterprises of all types are continuing to explore what edge computing has to offer, and we’re starting to approach the point where it reaches its full potential,” stated Ravi Mayuram, Senior Vice President of Engineering and CTO, Couchbase. “By making applications less and less reliant on synchronization with a central server, we’re giving enterprises the tools they need to take full advantage of edge. Regardless of their environment, enterprises can seamlessly spin up new edge deployments as and when they’re needed and take advantage of enhanced data transfer capabilities that make edge applications smarter than ever.”

Machine Learning Comes to MariaDB Open Source Database with MindsDB Integration

MindsDB, the open source AI layer for existing databases, announced its official integration with the widely used open source relational database, MariaDB Community Server. This integration fills a longstanding demand from database users for the ability to bring machine learning capabilities to the database and democratize ML use. MindsDB helps apply machine learning models directly in the database by providing an AI layer that allows database users to deploy state-of-the-art machine learning models using standard SQL queries. The use of AI-Tables helps database users leverage predictive data inside the database for easier and more effective machine learning projects.
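
A rough sketch of the AI-Tables pattern (train a predictor from existing rows, then query it like a table), sent to MariaDB from Python. The connection details, table, and column names are placeholders, and the exact SQL surface has changed across MindsDB releases, so treat the statements as an approximation of the pattern rather than exact syntax.

```python
import pymysql

conn = pymysql.connect(host="127.0.0.1", user="maria", password="secret", database="mindsdb")
cur = conn.cursor()

# Train a predictor from historical rows (approximate AI-Tables pattern).
cur.execute("""
    INSERT INTO mindsdb.predictors (name, predict, select_data_query)
    VALUES ('home_rentals_model', 'rental_price',
            'SELECT * FROM shop.home_rentals')
""")

# Query the trained model like an ordinary table to get a prediction.
cur.execute("""
    SELECT rental_price
    FROM mindsdb.home_rentals_model
    WHERE sqft = 900 AND neighborhood = 'downtown'
""")
print(cur.fetchone())
conn.close()
```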

“As MindsDB sets out to democratize machine learning, we’re excited to offer ML capabilities to the MariaDB community,” stated MindsDB co-founder, Adam Carrigan. “MariaDB shares our vision and understands that putting machine learning tools in the hands of the users that know their data best is the most effective way to solve their problems.”

Nexla Launches Nexsets to Reimagine DataOps and Drive Self-Service

Nexla, a converged data fabric, announced the launch of Nexsets, an innovative technology that makes data operations collaborative and allows data teams to easily enrich, secure, share, and validate data. The Nexla platform applies continuous intelligence to data to automate time-consuming engineering tasks. The result is a new way to drive self-service data integration and transformation. With the launch of Nexsets, business users now have access to curated data views that make it easy to connect data with any application with minimal engineering support.

Data environments are increasingly complex, and working with data across disparate systems is challenging. Integrating data, creating and managing APIs, maintaining security, and transforming and preparing data for a wide variety of systems and applications all place a massive burden on engineering resources. As a result, data engineers struggle to meet the needs of business users, and in turn, business users struggle to leverage data to move the business forward.

Upsolver Creates First-Ever Truly Open Cloud Lakehouse, Releases Native Ingestion Connectors To Redshift And Snowflake

Upsolver, provider of a cloud-native lakehouse engine, announced the release of native ingestion connectors to Amazon Redshift and Snowflake (NYSE: SNOW), creating the first-ever truly open cloud lakehouse. Using Upsolver’s platform, enterprises can now easily switch between data warehouses and data lake query engines, across multiple vendors.

While data warehouses are excellent for business intelligence, they cannot address all of a modern enterprise’s data processing needs, such as streaming, text search, and machine learning. And although data lakes are a cost-effective way to store massive amounts of data, they are complex to manage and require expensive engineering expertise (on-premises and in the cloud). Upsolver’s cloud lakehouse engine empowers organizations to achieve the cost and flexibility advantages of a data lake combined with the ease of use of a data warehouse.

“Solutions like Redshift and Snowflake are amazing for making data valuable, but one database cannot solve all use cases,” said Ori Rafael, CEO of Upsolver. “Organizations should be able to leverage multiple database engines and easily switch between them according to their use case, in-house skills, and cost restrictions. This is the vision of the open cloud lakehouse and Upsolver is the engine that powers it.”

Micro Focus Introduces New Data Analytics Capability to Drive Full-Stack AIOps

Micro Focus (LSE: MCRO; NYSE: MFGP) announced the release of its ITOM “Collect Once Store Once” Data Lake (COSO), using an open access data platform built on Vertica to drive full-stack AIOps across the broad set of Micro Focus monitoring and automation solutions. COSO is now an integrated part of Micro Focus Operations Bridge, Network Operations Management and Data Center Automation. This approach to providing a full spectrum of reporting and insights across multi-domain monitoring, management and patch compliance is only available from Micro Focus.
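
Because COSO exposes its data through Vertica, the stored operations data can in principle be queried with standard SQL tooling. Below is a minimal sketch using the vertica-python client; the host, credentials, and the events table with its columns are hypothetical, since the actual COSO schema is not described here.

```python
import vertica_python

# Hypothetical connection details for a Vertica-backed COSO data lake.
conn_info = {
    "host": "coso.example.com",
    "port": 5433,
    "user": "ops_reader",
    "password": "secret",
    "database": "coso",
}

with vertica_python.connect(**conn_info) as conn:
    cur = conn.cursor()
    # Hypothetical table: count monitoring events by severity over the last day.
    cur.execute("""
        SELECT severity, COUNT(*) AS events
        FROM monitoring_events
        WHERE event_time > NOW() - INTERVAL '1 day'
        GROUP BY severity
        ORDER BY events DESC
    """)
    for severity, events in cur.fetchall():
        print(severity, events)
```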

“The diversity of data available to IT operations today makes it challenging to solve complex issues across multi-cloud and on-premises IT services,” stated Tom Goguen, Micro Focus Chief Product Officer. “COSO now offers a unique collection and storage capability built on Vertica’s powerful, high-speed data analytics platform. Combine COSO with our world-class discovery, monitoring, process automation and patch management tools, and you can have full-stack AIOps today to identify root cause and restore service faster than ever before.”

O’Reilly Launches Powerful New Tool for Learning in the Flow of Work: O’Reilly Answers

O’Reilly, the source for insight-driven learning on technology and business, announced the launch of O’Reilly Answers, an advanced natural language processing (NLP) engine that delivers quick, contextually relevant answers to challenging technical questions posed by users through O’Reilly online learning. With a one-click integration into Slack, O’Reilly Answers helps users learn from and discover the content that moves business forward.
Leveraging advanced machine learning techniques, the O’Reilly Answers search engine provides relevant highlights and snippets from O’Reilly’s library of expert content across thousands of O’Reilly titles, pointing users directly to only the most applicable resources and eliminating noise. To encourage deeper discovery, the feature allows users to drill down into full content pieces from referenced titles. To further improve productivity, all capabilities of O’Reilly Answers are available through a simple Slack integration.

“Over half of all O’Reilly usage is non-linear learning – finding fast solutions that can quickly be applied to work. Taking time to dig up resources can mean the difference between moving to the next step or stalling on a project,” stated Laura Baldwin, President, O’Reilly Media. “As we fall into step with the new pace of organizational change, O’Reilly Answers helps bridge the gap between learning and knowledge, eliminating the need for lengthy training sessions and helping users get back to work with the tools they need to get the job done.”

Sign up for the free insideBIGDATA newsletter.
