Welcome to the big data and data management community on BrightTALK. Join thousands of data quality engineers, data scientists, database administrators and other professionals to find more information about the hottest topics affecting your data. Subscribe now to learn about efficiently storing, optimizing a complex infrastructure, developing governing policies, ensuring data quality and analyzing data to make better informed decisions. Join the conversation by watching live and on-demand webinars and take the opportunity to interact with top experts and thought leaders in the field.
“You can’t use it if you can’t find it” – Companies today collect, store, and use more data than ever before. Yet studies and surveys show that most organizations are better at collecting data than at using it.
Why is that? One cause is that companies often do not know which data is being collected, where it is stored, or how it can be used. There is a lack of transparency and structure.
In our 45-minute webinar, we will use practical examples to show how a data catalog can make information available centrally across all data stores. Learn how our customers benefit and which challenges can be mastered with a data catalog.
Instaclustr’s Managed Platform simplifies and accelerates the delivery of reliability at scale through open source solutions. By providing Apache Cassandra, Apache Kafka, Elasticsearch, Apache Spark and other leading open source technologies through a single environment, Instaclustr makes it easy to deliver applications at any scale and, most importantly, operate them at the highest levels of reliability. Instaclustr provides the benefits of open source without the risk and costs of learning to manage it yourself.
This webinar will
- walk through the key features of Instaclustr’s platform and offering;
- illustrate specific features and value-adds for each of the open source technologies offered; and
- provide insight into Instaclustr’s product roadmap for future enhancements.
Data is fueling a new digital economy and compelling companies to rapidly adopt modern technologies such as Machine Learning, AI and Cognitive Science. Consequently, assembling the right blend of data from disparate sources using agile and flexible techniques like logical data warehousing to create purposeful, accessible insights is one of the greatest strategic tasks before us.
To address the challenges associated with advanced analytics solutions, Neudesic uses a best-fit-engineering approach to enable enterprises to utilize the right tools for the right job to maximize their data and analytics strategy. When helping customers construct architectures that surface more data to an ever-growing number of data consumers without the need for data replication, Neudesic looks to Denodo as its tool of choice.
Join Neudesic and Denodo for an interactive webinar to learn how you can apply data virtualization to your advanced analytics strategy to achieve growth objectives. You will learn:
* Why data virtualization should be part of your advanced analytics strategy.
* How easily your use case will fit one of the numerous architecture patterns Denodo enables.
* How Denodo’s innovative engine offers best of breed data virtualization capabilities, through a product demonstration.
This TechTALK introduces the Operational Data Warehouse (ODW). Pradeep introduces the ODW and the Actian Vector analytics database. Emma McGrattan will provide an overview of the key database technologies that enable ODW use cases. Pradeep will close with a demonstration of the Windows edition.
Viewers can ask questions during the event to be addressed during the live Q & A or by email following the event.
Download the Evaluation Edition at https://www.actian.com/lp/try-vector/
As your infrastructure has grown to include a mix of physical, virtual and cloud environments with increased network speeds and data volumes, the threats to your attack surface have grown as well, with more vectors to breach your organization. This challenges your network and security operations teams, and your traditional network packet broker needs to evolve from providing network visibility to also helping strengthen your security posture. Join Gigamon and (ISC)2 on October 18, 2018 at 1:00PM Eastern, where we will examine the acquisition and aggregation of data from your physical, virtual and cloud infrastructure; filtering of traffic to provide the right data to the right tools; transforming your data with masking, header stripping and SSL/TLS 1.3 decryption to ensure compliance; threat prioritization by providing context; and bridging the gap between NetOps and SecOps.
In order to deliver immediate value back to the business, it’s critical to ensure that an organization’s financial systems are running at full strength, but in most cases, I/O bottlenecks throttle performance and delay analytic outcomes. Vexata and Levyx have collaborated on a joint solution that achieves increased performance with less infrastructure, resulting in a 300% improvement in the price/performance ratio over the industry's next best alternative solution. In this webinar, you’ll learn:
•How to utilize the Levyx low-latency software and Vexata’s NVMe-based storage systems
•Best practices to eliminate bottlenecks for tick-analytics, strategic back-testing, algorithmic modeling, etc.
•Real-world results from customer trials and the recent STAC A3 test benchmarks
•Matt Meinel, Senior Vice President of Solutions Architecture, Levyx Inc.
•Rick Walsworth, VP of Product & Solution Marketing at Vexata
Like all telecommunication giants, Bell Canada relies on huge volumes of data to make accurate business decisions and deliver better services. They use BI tools such as MicroStrategy and Tableau to build dashboards and reports that provide actionable insights on key performance indicators (KPIs).
However, as their data grew to Big Data proportions, they began facing technological challenges in meeting growing business reporting demands. They knew they had to make architectural changes to meet their business teams’ requirements for interactive slicing and dicing.
In this session Big Data leadership from Bell Canada will present why they chose OLAP on Hadoop technology to achieve multi-dimensional analytics.
They will discuss:
- Why adopting OLAP on Hadoop was mission critical for their business teams
- How they exponentially increased the volume and time span of the data they could analyze
- How this new architecture delivers fast performance even with large numbers of concurrent users
Join Cory and Rob as they discuss Splunk’s newest announcements with podcast alum Jon Rooney, Vice President of Product Marketing for Splunk. In this episode we dig in on Splunk Next and its components. The announcements at Splunk .conf included the ability to use augmented reality, business process flows, natural language processing, Phantom, federated search, data stream processing, and quite possibly the most applauded announcement, “Dark Mode”. We dig into the potential of each of these topics and their real-world applications with Jon.
With organizations collecting an ever-increasing volume of data, the risk of a data breach or falling foul of a regulator is also increasing. Data security, privacy and protection is fast becoming a “must have” requirement within many data programs.
Organizations are starting to realize that there are potentially great synergies in understanding their data much more closely from both a governance and a security viewpoint. Add in artificial intelligence and automation for remediation, and together these capabilities are proving to be significant allies in the continuous battle of cyber-security and in enabling data governance programs.
This webinar explores how these two worlds can now better understand the role that each has to play in supporting and protecting their organization.
As part of the Reimagine Data Governance series of webinars, Informatica will demonstrate how a closer relationship between the worlds of governance and security can enhance existing data use and data security capabilities, and how you, by taking that holistic approach, can provide governed and protected data to achieve key business outcomes.
Join us as we build a complete streaming application with KSQL. There will be plenty of hands-on action, plus a description of our thought process and design choices along the way. Look out for advice on best practices and handy tips and tricks as we go.
This is part 2 out of 3 in the Empowering Streams through KSQL series.
Real-time recommendations are at the core of digital transformation in any business today. Whether you’re building features such as product, content or promotion recommendations, personalised customer experience, or re-imagining your supply chain to meet growing customer demands, you’re facing challenges that require the ability to leverage connections from many different data sources, in real-time. There’s no better technology to meet these challenges than a native graph database technology such as Neo4j.
This webinar will cover the fundamentals of building recommendation engines with a graph database. We will discuss typical architectures, give a demonstration of Neo4j in action, and go over some of our top use cases of recommendation engines for companies such as Walmart, eBay, and more.
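The “customers who bought this also bought” pattern at the heart of such recommendation engines is easy to see in plain code. Below is a minimal, illustrative Python sketch of the two-hop traversal a graph database performs natively; the tiny purchase graph and names are invented, and in Neo4j this query would be expressed in Cypher over (:Customer)-[:BOUGHT]->(:Product) relationships rather than Python dicts.

```python
from collections import Counter

# A tiny purchase graph as adjacency sets; in a graph database these would
# be Customer and Product nodes connected by BOUGHT relationships.
bought = {
    "alice": {"laptop", "mouse"},
    "bob": {"laptop", "keyboard"},
    "carol": {"laptop", "keyboard", "monitor"},
}

def recommend(customer: str) -> list[str]:
    """Recommend products bought by customers who share a purchase:
    the classic two-hop 'also bought' graph traversal."""
    mine = bought[customer]
    scores = Counter()
    for other, theirs in bought.items():
        if other != customer and mine & theirs:  # shared purchase = first hop
            for product in theirs - mine:        # their other items = second hop
                scores[product] += 1
    return [p for p, _ in scores.most_common()]

print(recommend("alice"))  # → ['keyboard', 'monitor']
```

A native graph database runs this same traversal as index-free adjacency over millions of nodes, which is why it scales where join-heavy relational queries struggle.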
As more organizations adopt a cloud-first strategy, the task of migrating high-volume transactional workloads presents a unique set of challenges, particularly in handling the large amounts of data involved. Join Primitive Logic and Actifio as we discuss the most pressing challenges around transactional data migrations … and the solutions that can help address them.
You will learn:
- The unique challenges in migrating transactional data to the cloud
- How to handle data for applications with both on-prem and cloud components
- How to approach transactional data as part of a multi-cloud strategy
- How data virtualization helps resolve issues of security, governance, multi-cloud coordination, and more
Even with a cloud-first strategy, enterprise IT is increasingly concluding that there will always be an on-premises component, and therefore that hybrid cloud is the only long-term end state. This presentation focuses on the data aspect of hybrid cloud. Storage is foundational to computing, and the same is true for hybrid cloud computing. The presentation briefly identifies hybrid cloud storage use cases (DR, data protection, archive, etc.) and then looks at how users are implementing and managing hybrid cloud storage.
As companies become more data driven, it’s not enough to deliver the same standard SQL queries. Instead, data teams today are transforming SQL results into models to deliver more value to their businesses. But making that jump is tough; it’s hard to determine where to get started in the complex field of modeling and regression analysis.
The Periscope Data community has been buzzing with requests to demystify Python and R modeling so data professionals can boost their skill set. Get started by creating a linear regression model with Neha Kumar, a customer solutions engineer at Periscope Data, and start bringing new analytics value to your business immediately.
Join Neha on Wednesday, October 17th, 2018 at 11 a.m. PDT as she walks through:
- What a linear regression is and when to use it
- Setting up the framework for a linear regression
- Step-by-step creation of a linear regression model in Python and R
- Contextualizing a model for other business teams
Link to the raw data file - https://community.periscopedata.com/t/h42dp1
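For readers who want a feel for the mechanics before the session, here is a minimal, illustrative sketch of an ordinary-least-squares linear regression in Python with NumPy. The toy numbers are invented and unrelated to the linked dataset; the webinar walks through its own, fuller workflow.

```python
import numpy as np

# Toy data: e.g. advertising spend (x) vs. revenue (y); values are
# purely illustrative, not from the webinar's dataset.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Fit y = slope * x + intercept by ordinary least squares.
slope, intercept = np.polyfit(x, y, deg=1)

# Predict for a new observation.
y_hat = slope * 6.0 + intercept
```

The same fit can be produced in R with `lm(y ~ x)`; the session covers both languages plus how to contextualize the resulting model for business teams.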
Join the latest webcast in our Data Disruptors Series to see how Namara - a data management platform - uses collaboration and role management as fundamental pillars to give businesses greater visibility over structured data across multiple data silos, as well as access to the wider world of public open data.
On this webcast you’ll hear from Brendan Stennett, Co-Founder & CTO of Toronto-based ThinkData Works on how the Namara platform:
• Provides fine-grained access control to the right people in an organization so they can access the data they need to fuel the products and applications that are driving their business forward
• Uses a proprietary data ingestion pipeline to make on-boarding data in a variety of formats a fast and painless experience
• Leverages a unique data query engine to make integrating data into apps easy and flexible
• Delivers world class query performance, powered by Vertica
GemFire GraphQL (G2QL) is an extension that adds a new query language for your Apache Geode™ or Pivotal GemFire clusters, allowing developers to build web and mobile applications using any standard GraphQL libraries. G2QL provides an out-of-the-box experience by defining GraphQL schema through introspection. It can be deployed to any GemFire cluster and serves a GraphQL endpoint from an embedded Jetty server, just like GemFire’s REST endpoint.
We will be demoing G2QL using a sample application that can read and write data to GemFire and share data between applications built using GemFire client APIs, showing you:
- How to use GraphQL to query and mutate data in GemFire
- How to use open-source GraphQL library to build web and mobile applications using GemFire
- How to use GraphQL to deal with object graphs
- How G2QL can simplify your overall architecture
APIs have become the connective tissue for the digital enterprise. From app development, to open APIs, to API-led integration, the adoption of best practices and success of your API program is a key indicator of your strategic success as a digital enterprise.
Join us for wide-ranging discussion with Cox Automotive covering API program best practices, distributed API management across user groups, API-led integration, and the evolution to event-driven architecture that is driving better connectivity with Cox’s business units and strategic partners.
We’ll also cover:
-The ABCs of APIs
-The API program lifecycle
-Use case demos
-Visualized API performance analytics
Can your organization react to customer events as they occur?
Can your organization detect anomalies before they cause problems?
Can your organization process streaming data in real time?
Real-time and event-driven architectures are emerging as key components in developing streaming applications. Nearly half of organizations consider it essential to process event data within seconds of its occurrence, yet less than one third are satisfied with their ability to do so today. In this webinar featuring Dave Menninger of Ventana Research, learn from the firm’s benchmark research what streaming data is and why it is important. Joanna Schloss also joins to discuss how event-streaming platforms deliver real-time actionability on data as it arrives into the business. Join us to hear how other organizations are managing streaming data and how you can adopt and deploy real-time processing capabilities.
In this webinar you will:
-Get valuable market research data about how other organizations are managing streaming data
-Learn how real-time processing is a key component of a digital transformation strategy
-Hear real world use cases of streaming data in action
-Review architectural approaches for adding real-time streaming data capabilities to your applications
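As a concrete illustration of reacting to events as they occur, here is a minimal Python sketch of streaming anomaly detection over a rolling window. The event shape, window size and threshold are invented for illustration; production event-streaming platforms apply the same idea continuously over partitioned, distributed streams.

```python
from collections import deque

def detect_anomalies(events, window=5, factor=3.0):
    """Flag any event whose value exceeds the rolling mean by `factor`,
    yielding the anomaly the moment the event arrives."""
    recent = deque(maxlen=window)
    for ts, value in events:
        if len(recent) == window and value > factor * (sum(recent) / window):
            yield ts, value          # react as the event occurs
        recent.append(value)

# Ten normal readings followed by a spike.
stream = [(t, 10.0) for t in range(10)] + [(10, 100.0)]
anomalies = list(detect_anomalies(stream))
print(anomalies)  # → [(10, 100.0)]
```

Because the detector is a generator over an unbounded iterable, the same logic works whether the source is a list, a socket, or a consumer reading from a message broker.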
A complete view of the customer encompasses a single view of the customer, the customer’s relationships, and the customer’s transactions and interactions. The result is increased customer satisfaction and retention, as well as increased revenue through up-sell and cross-sell opportunities.
Attend this session to learn how data virtualization supports a complete view of the customer by:
- Facilitating integration of master data with any other data throughout the enterprise
- Providing real-time data access to the complete customer view, for any individual or organization across the company
- Reducing data replication and its associated costs and risks, and providing access to cross-company data
- DV benefits for Complete View of Customer
- Product Demonstration
- Summary & Next Steps
Enterprise IT is facing huge volumes of new data that is overwhelming legacy SAN solutions. Rather than get trapped with the outcomes of an aging legacy system, learn how you can leave slower data access, increased costs, and the inflexibility of a legacy infrastructure behind. Join the webcast “Demanding Workloads Demand NVMe” and find out how moving to an end-to-end NVMe infrastructure can clear your path toward data center modernization.
You will hear directly from three leading experts: Andrew Grimes from NetApp, Eric Burgener from IDC, and Naem Saafein from Brocade.
They will discuss how you can:
• Speed time to innovation by maximizing your network resources with the latest all-flash storage technologies
• Become a key part of your company’s digital transformation and future success
• Easily move from a siloed solution to a cloud-connected all-flash infrastructure
Jupyter notebooks are transforming the way we look at computing, coding and problem solving. But is this the only “data scientist experience” that this technology can provide?
In this webinar, Natalino will sketch how you could use Jupyter to create interactive and compelling data science web applications and provide new ways of data exploration and analysis. In the background, these apps are still powered by well understood and documented Jupyter notebooks.
Natalino will present an architecture composed of four parts: a Jupyter server-only gateway, a Scala/Spark Jupyter kernel, a Spark cluster, and an Angular/Bootstrap web application.
Increased workload coupled with an industry-wide shortage of skilled responders is a common challenge heavily impacting operational performance in Security Operations Centers globally. An integral part of the solution is formulating a methodology to ensure that crucial knowledge is held and transferred between incident responders at all levels and overall retained within the organization.
By utilizing Security Orchestration, Automation and Response (SOAR) technology, security teams can combine traditional methods of knowledge transfer with more modern techniques and technologies by incorporating machine learning and artificial intelligence capabilities.
Join this webinar to learn about the benefits of implementing a SOAR solution, such as IncMan SOAR from DFLabs, and see how we can help to ensure that your organization’s knowledge is consistently and accurately retained, used and transferred, while simultaneously contributing to the efficiency and effectiveness of your entire incident response process.
- The benefits of using SOAR technology
- How to overcome the shortage of skilled security operations staff
- How security orchestration and automation can facilitate knowledge transfer
- How a SOAR solution can improve your overall security program performance
A small team of data experts collaborated to collect, analyze and visualize Twitter data from the UN General Assembly. In this webinar we will take you through the data collection, preparation, transformation and loading process and show a live demo of the Alteryx workflow that was used.
Join us, ask questions during the webinar and see how you could easily build a data pipeline to collect information about a specific event or initiative.
This webinar is targeted at any level of expertise; we will explain concepts, processes and the individual steps taken with the different tools.
Multi-cloud and hybrid environments have become a growing strategy for many organisations moving their applications and data to the cloud, as they give organisations the flexibility to choose what they need for their business. In fact, according to recent research by Computing, 26 per cent of enterprises surveyed were using a multi-cloud approach, compared with 22 per cent last year. However, alongside this approach, one thing organisations should not forget is that the need for security does not diminish. As the cloud continues to offer more services, organisations need to be aware of the protection they are responsible for to ensure their applications and workloads are safe.
So how do organisations stay ahead of the curve with security as well as being able to manage these multi-cloud or hybrid environments?
Join our latest webinar, where Chris Hill, VP Public Cloud EMEA at Barracuda, and Alex Hilton, Chief Executive of the Cloud Industry Forum (CIF), will discuss the latest challenges of migrating to the cloud, especially in a multi-cloud environment, and analyse CIF's latest research findings on what organisations need to consider when adopting these cloud services.
As an enduring leader in master data management, Informatica continues to innovate, anticipating market needs and delivering new functionality and enhancements to organizations of all sizes. The newest capabilities delivered in Multidomain MDM 10.3 and MDM – Customer 360 10.3 make it easier for business users to manage, curate, and consume master data. This includes:
•Configurable, secure and feature-rich user interfaces for managing master data
•Improved search and query experiences, more workflow configuration options, and new capabilities to support compliance use cases
•Improved business user interface to make it easier to work with different mastered domains, hierarchies, and stewardship
•Increased efficiency for data stewards through ad hoc match, wizard-driven reports and charts, and more
Join Oliver Soans, Principal Product Manager, MDM Solutions, to learn how Informatica is simplifying the complex art of master data management to deliver business value faster. You will also see a demo of the product.
There’s a lot of hype around data-driven business. And as enterprises face the coming wave of Machine Learning and Artificial Intelligence, there’s also a lot of confusion about how to prepare your data, infrastructure and organization for this work.
In this webinar, Donald Farmer of TreeHive Strategy sets out 5 questions you can ask of your business to assess whether you are making the most of your current analytics, and whether your roadmap for the future will be effective. We’ll consider the tools you use, the skills you need in your organization and the data pipeline you need to build for governance and flexibility.
Donald’s approach covers both strategy - What do you need to achieve for your business? - and tactics - How will you achieve it? CIOs, IT managers and anyone working in analytics and data management will find it insightful and helpful.
We will discuss the advantages of the network fabric for the PowerEdge MX modular server platform. We will look in detail at how the network fabric is designed, which elements it consists of, how it scales and is managed, and which connectivity options it offers.
Sensitive information about individuals can be recovered from different types of data releases, including aggregate statistics or machine learning models. This session will address the privacy risks in publishing analysis results and introduce data privacy techniques to defend against them.
Theresa Stadler, Data Scientist at Privitar, will explain differencing and reconstruction attacks on simple summary statistics such as count tables, along with discussing the privacy risks of supervised machine learning.
Some of the takeaways of the session include:
- Reasons to be concerned about the privacy of training data
- Which attacks can occur on machine learning models, and what private information about individuals in the training data can be recovered
- A simple example of a black-box privacy attack on a classifier, a common machine learning model
- An introduction to the differential privacy framework that functions as a privacy-enhancing technology to defend against the attacks introduced
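A differencing attack on count statistics, one of the session's topics, can be shown in a few lines. The following Python sketch uses an invented three-person dataset: publishing a count for the full group and for the group minus one person reveals that person's private attribute exactly, even though only aggregates were released.

```python
# Invented toy dataset for illustration only.
dataset = [
    {"name": "A", "smoker": True},
    {"name": "B", "smoker": False},
    {"name": "C", "smoker": True},
]

def count_smokers(rows):
    """An innocuous-looking aggregate statistic."""
    return sum(1 for r in rows if r["smoker"])

# The analyst publishes the count for everyone, then for "everyone
# except C" (e.g. via a filter on an attribute only C has). Subtracting
# the two aggregates recovers C's individual value exactly.
all_count = count_smokers(dataset)
without_c = count_smokers([r for r in dataset if r["name"] != "C"])
c_is_smoker = (all_count - without_c) == 1
```

Differential privacy defends against exactly this: adding calibrated noise to each released count makes the difference between the two queries statistically uninformative about any single individual.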
AI attracts a lot of attention today, and the advances are usually driven by machine and/or deep learning. However, the tools most commonly used in this space require specialist technical and data skills. Fortunately, a new generation of tools is emerging that aims to make AI available to all. I'm a big believer in and supporter of this trend. In this session I take you through the tools and techniques that are making AI capabilities more easily available to billions of ordinary folk worldwide.
Doug Merritt, CEO of Splunk, sat down with the Big Data Beard to talk about the exciting announcements coming from .conf2018 as well as how he is leading Splunk's massive growth while retaining its incredible culture and focus on using its powers for good.
The ability to find patterns in data and turn them into actionable insights that predict future states is key to competitive advantage for any organization, and it can drive better strategies. Today, every CEO wants business problems to be solved with an automated Artificial Intelligence/Machine Learning framework. But whether, and to what degree, this can be achieved depends on several things, such as vision, commitment and involvement, to name a few. This is the question senior executives ponder. In this presentation, our goal is to help organizations understand best practices and frameworks/tools for Artificial Intelligence in general, and Machine Learning and Deep Learning in particular, and so help you think about how you and your organization can stay relevant and successful in the years and decades to come.
To borrow a phrase from a popular song by R.E.M.: “It’s the end of the LUN as we know it and I feel fine.” VMware VVols changes everything we know about storage for vSphere, in a good way: with VVols, LUN management is a thing of the past. VMware VVols represents the future of external storage for vSphere, and that future is here right now. VVols also represents many years of engineering work by both VMware and its storage partners. The result of that work is a new storage architecture for vSphere that solves many of the hidden complexities inherent in VMFS and levels the playing field between file and block protocols. Learn from experts at HPE and VMware how VVols transforms external storage in vSphere, eliminates complexities and provides very real benefits to customers.
Manufacturing is going through a deep transformation, with changes that are all about digitalization. With Industry 4.0, digitalization is more visible and disruptive in manufacturing, expanding virtual data and processes in an industry that is fundamentally about dealing with physical products.
This transformation has already started, and its impact is expected to be massive, with changes across the whole manufacturing ecosystem, technically, economically, and socially. Key elements of the Industry 4.0 transformation include 3D printing, robotics and automation, smart factories with IoT and machine learning, and supply chain digitization.
Now, blockchain is becoming a key technology driving this digital revolution.
Join us for this webinar hosted by some of our best blockchain subject matter experts to learn:
● The impacts of technology on the manufacturing industry
● Why blockchain is a valuable technology for next-generation manufacturing use cases
● How blockchain is applicable to different areas of manufacturing
The next generation of BI technology will rely heavily on machine intelligence – and how that intelligence is used matters. The best solutions will support human-centered analysis, giving users at any skill level the ability to gather, visualize, and analyze data. Join this webinar to hear The Eckerson Group and Qlik explain how augmented intelligence is making analytics even more accessible to users.
From cloud computing, to blockchain and open APIs, there is a plethora of new technologies that are revolutionising the banking and payments ecosystem.
Join this interview where Mihail Dula, Head of Product Management for Americas Payments at Finastra will discuss:
-How is the payments landscape evolving and what’s driving this change?
-Where is there opportunity for banks?
-How are emerging technologies such as cloud, RTP, machine intelligence, APIs and blockchain impacting payments and creating opportunity?
-It seems that bank perception around cloud-based payment services is shifting, with more banks being open to cloud as part of their infrastructure. In terms of options available, what has changed regarding payments technology?
-Security, regulatory compliance, cost and risk were major concerns of cloud. Have perceptions changed in the industry? How have these concerns been addressed?
-Likewise, cloud computing is accelerating innovation and adoption of emerging technologies. What are the benefits and opportunities of cloud based offerings?
This webinar is part of BrightTALK's What's Big in BI series.
What is the moral responsibility of a data team today? As artificial intelligence and machine learning technologies become part of our everyday life and as data and big data insights become accessible to everyone, CDOs and data teams are taking on a very important moral role as the conscience of the corporation.
In this episode of What's Big in BI, Harry Glaser highlights the risks companies will face if they don't empower data teams to lead the way for ethical data use across a variety of functions including business intelligence, analytics, big data initiatives and more.
Harry founded Periscope Data in 2012 with co-founder Tom O’Neill. The two have grown Periscope Data to serve more than 1000 customers. Glaser was previously at Google, and graduated from the University of Rochester with a bachelor’s degree in computer science.
In this webinar, the audience will learn how Machine Learning and AI are being adopted in the real world, working with Google.
Google uses 4000+ machine learning models to run everything from their search engine, to Gmail and much more. The same infrastructure is now being used across multiple industries like supply chain, retail, manufacturing, and health care, to solve a myriad of problems like:
- Improving the taste of beer
- Beating humans in forecast accuracy
- Driving delivery of fresher fruits and helping online retailers reduce inventory carrying costs by over 50%
- and much, much more.
All this is being accomplished in ways that were never imagined before using machine learning and AI technology.
In this webinar we will walk you through how customers of all sizes are going through this digital transformation and how these results signal a huge wave ahead for businesses worldwide.
Manju Devadas is CEO of Pluto7, a Google “Global Breakthrough Partner of the Year” finalist company focused on Machine Learning and AI. At Pluto7, Manju utilizes his 17+ years of experience in predictive analytics to transform business on Google Cloud Platform (GCP). He believes that the next 10 years will be the age of machines, and companies like Google and Pluto7 are early adopters.
Learn how to drive business value with Machine Learning using Feature Engineering, including feature extraction and feature selection, to generate insights that resonate with business users. The complete Feature Engineering process will be discussed, from data wrangling through cleaning, exploring, modelling, validating, analyzing, and visualizing. Specific business use cases will be presented to clearly convey how to apply Machine Learning and Feature Engineering to create accurate and interpretable predictive models.
WHO SHOULD ATTEND?
· CTO – Understand risk-averse Cloud based Machine Learning
· CIO - Support business operations with automated Feature Engineering
· Analytics Directors/Managers – Implement predictive analytics solutions fast
· Line-of-Business Managers – Identify business needs ripe for Machine Learning
· Data Scientists – Learn how TenPoint7 has implemented Feature Engineering.
WHAT WILL YOU LEARN?
· What is Feature Engineering and how to apply it to business use cases
· Ability to identify areas in your business where Feature Engineering can shine
· How to greatly reduce the time and cost of implementing Machine Learning solutions that include data scientists, data engineers, an ML platform, and the application, all in a single subscription
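To make the extraction-then-selection idea concrete, here is a minimal, illustrative Python sketch (not TenPoint7's implementation): it derives a new feature from raw columns, then ranks candidate features by absolute correlation with the target. The toy data and column names are invented.

```python
import numpy as np

# Toy tabular data: monthly revenue and cost for four accounts; the
# target is churn. All values are invented for illustration.
revenue = np.array([100.0, 200.0, 150.0, 300.0])
cost    = np.array([ 90.0, 120.0, 140.0, 150.0])
churned = np.array([  1.0,   0.0,   1.0,   0.0])

# Feature extraction: derive a margin feature the raw columns don't expose.
margin = (revenue - cost) / revenue

# Feature selection: rank candidate features by absolute correlation
# with the target and keep the strongest one.
features = {"revenue": revenue, "cost": cost, "margin": margin}
scores = {name: abs(np.corrcoef(col, churned)[0, 1])
          for name, col in features.items()}
best = max(scores, key=scores.get)
print(best)  # → margin
```

Here the derived margin feature correlates with churn far more strongly than either raw column, which is the essence of why feature engineering often matters more than model choice for interpretable predictions.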
As engineers work on changes to migrate architectures or improve performance, it is critical to invest in risk mitigation and track the success of these projects. Feature flags and experimentation provide the data needed to assess the success of architectural migrations with statistical accuracy. Experimentation ties engineering metrics back to the architectural changes that are impacting them.
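The deterministic bucketing that makes per-cohort metric comparison possible can be sketched in a few lines of Python. This is an illustrative toy, not any particular feature-flag product's implementation; real systems would use a managed flag service, but the assignment logic is the same idea.

```python
import hashlib

def in_treatment(user_id: str, flag: str, rollout_pct: int) -> bool:
    """Stable hash-based assignment: the same user always lands in the
    same bucket for a given flag, so engineering metrics can be compared
    per cohort while the migration is rolled out gradually."""
    digest = hashlib.sha256(f"{flag}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100          # uniform bucket in 0..99
    return bucket < rollout_pct

# Route roughly 10% of users to the new architecture behind a flag.
users = [f"user-{i}" for i in range(1000)]
treated = sum(in_treatment(u, "new-storage-backend", 10) for u in users)
# `treated` is close to 100, and assignment is stable across calls.
```

Because assignment is a pure function of user and flag, the experiment framework can attribute latency or error-rate metrics to the treatment and control cohorts and test the difference for statistical significance before ramping the rollout percentage.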
Operational Insights is a Cloud-based Application that provides visibility into the performance and operational efficiency of Informatica assets across the enterprise. This webinar will introduce a new product offering - Operational Insights for Informatica Big Data Management. Now you can better understand big data cluster resource utilization of Informatica jobs, analyze mapping/workflow executions, manage capacity, and troubleshoot issues in minutes. Includes product demos.
Join this live panel streamed from Money20/20 Vegas where our speakers will discuss:
-Challenges of enabling a successful omnichannel payment ecosystem
-Platforms, APIs, and infrastructure
-Are we getting closer to a cashless society?
-Optimising the eCommerce experience
-The importance of payment security
-Mobile, cross-border, and contactless payments
-Next generation commerce and retail
Massive scalability is one thing, but what if you want to run Object-Based Storage for a smaller shop? Can you start with a small hardware investment and run object-based storage on just one server? John Bell, Sr. Consultant, and Jamshid Afshar, Caringo Engineer, will explain how you can store, manage, search and deliver data with just one server, while maintaining the ability to scale out by simply plugging in additional servers as your data storage needs grow. They will explain how this “pay-as-you-grow” model can benefit organizations as they start to outgrow traditional SAN, NAS and Tape storage solutions.
Building data science solutions for business does not rely solely on the intellectual capacity of data scientists. Besides leveraging their mathematical and analytical skills on model development, data science teams need to effectively address the challenges of:
- working with big data sources and unstructured data,
- spending too much time on data processing tasks,
- deploying the final solution to production and automation.
In this talk, we discuss the importance of tools for ETL/ELT and analytics orchestration for unleashing the full potential of data science teams.
The artificial intelligence revolution is being enabled by the rapid increase in data, computing power in the cloud, hardware accelerators like GPUs, and algorithm innovations. However, fewer than 5% of organizations have adopted or deployed AI in their operations, while more than 80% are still gathering knowledge and prototyping.
In this webinar, we identify the challenges of transitioning from Prototype to Production. We break them down into four categories of challenges:
1) Code portability,
3) Hyperparameter optimization or Automated machine learning, and
4) Model Deployment.
In this talk, we will address challenges 1 & 2 using RocketML to aid in building production-grade AI applications. These techniques will not only improve data scientists' productivity, but will also reduce total cost of ownership (TCO) compared to market incumbents.
The webinar is hosted by Fundoo.info and moderated by an IITAGH Board Member.