Welcome to the big data and data management community on BrightTALK. Join thousands of data quality engineers, data scientists, database administrators and other professionals to find more information about the hottest topics affecting your data. Subscribe now to learn about storing data efficiently, optimizing complex infrastructure, developing governance policies, ensuring data quality and analyzing data to make better-informed decisions. Join the conversation by watching live and on-demand webinars and take the opportunity to interact with top experts and thought leaders in the field.
10 in Tech hosts Carlos Casanova and Shane Carlson sat down with Michael Ludwig from Blazent at Knowledge17, where they discussed discovery, the CMDB and data quality. Getting the CMDB right is critical to success in a ServiceNow implementation. Listen to this episode to find out how Blazent deals with data quality and the CMDB.
Migrating your infrastructure from on-premises to the cloud requires extensive monitoring to ensure your IT operations don't suffer along the way. For many organizations, garnering this insight can be difficult -- if not impossible. Join Blue Medora's Mike Langdon to learn key best practices to monitor your infrastructure during migration, helping you simplify the process and ensure optimal performance from migration to decommissioning.
Performance is often a key factor in choosing big data platforms. Over the past few years, Apache Spark has seen rapid adoption by enterprises, making it the de facto data processing engine for its performance and ease of use.
Since starting the Spark project, our team at Databricks has focused on accelerating innovation by building the most performant and optimized Unified Analytics Platform for the cloud. Join Reynold Xin, Co-founder and Chief Architect of Databricks, as he discusses the results of our benchmark (based on the TPC-DS industry standard) comparing the Databricks Runtime (which includes Apache Spark and our DBIO accelerator module) with vanilla open source Spark in the cloud, and how these performance gains can have a meaningful impact on your TCO for managing Spark.
This webinar covers:
Differences between open source Spark and Databricks Runtime.
Details on the benchmark including hardware configuration, dataset, etc.
Summary of the benchmark results, which reveal performance gains of up to 5x over open source Spark and other big data engines.
A live demo comparing processing speeds of Databricks Runtime vs. open source Spark.
Special Announcement: We will also announce an experimental feature as part of the webinar that aims at drastically speeding up your workloads even more. Be the first to see this feature in action. Register today!
Today, businesses of varying sizes, industries, and product and service mixes are actively applying Predictive Business Analytics to drive greater customer engagement and, in turn, increased revenues and profits. Often these businesses are looking to their finance group to guide, if not lead, these efforts to successful outcomes.
In a recent article, Ray Tong cited a CFO magazine survey which found that:
• Over 70% of respondents said that they plan to substantially increase the use of data analytics to support decision making and improve business partnering;
• 68% of respondents, the majority of whom were finance executives, said that they plan to improve their data analytic skills in the coming year.
Consequently, it is essential that the financial professional understands what it is going to take for them to succeed.
This webinar discusses the five (5) keys to applying Predictive Business Analytics in your organization. It provides attendees with specific and practical insights to guide their approaches to develop and deploy an effective process that improves managerial decision making across many core performance and financial areas.
What you will learn:
• What are the key steps to success with practical examples
• What steps to avoid
• How to ensure and gain organizational impact
Larry Maisel is a recognized author and thought leader in Business Analytics, with experience helping companies improve operating performance and business results. He is an accomplished business professional with over 20 years of experience, proven leadership skills, and expertise in driving bottom-line profits and implementing business systems.
In this technical session, we will learn about the different use cases around Amazon Redshift and discuss how Informatica Intelligent Cloud Services is used to implement these use cases. Through real world examples, you’ll learn about best practices, when to use ETL versus ELT, performance tuning, and new features recently introduced.
Key webinar learnings:
• Amazon Redshift: Key benefits, capabilities, and use case patterns
• Informatica Intelligent Cloud Services for Amazon Redshift and other Amazon services
• Best Practices
• Deep Dive Demo of key use cases
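The ETL-versus-ELT distinction mentioned above can be shown in miniature: in ELT you load raw data first, then transform it inside the warehouse with SQL, where the data already lives. Below is a minimal, purely illustrative sketch using SQLite as a stand-in for a warehouse such as Redshift (table and column names are invented for the example):

```python
import sqlite3

# Stand-in warehouse: in ELT, raw rows are loaded as-is first ("E" and "L").
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount TEXT, region TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, "10.50", "us-east"), (2, "7.25", "us-east"), (3, "99.00", "eu-west")],
)

# The transform ("T") then runs in-database as SQL, rather than in an
# external ETL tool before loading.
conn.execute("""
    CREATE TABLE orders_by_region AS
    SELECT region, SUM(CAST(amount AS REAL)) AS total
    FROM raw_orders
    GROUP BY region
""")
for row in conn.execute("SELECT region, total FROM orders_by_region ORDER BY region"):
    print(row)
```

In an MPP warehouse like Redshift, pushing the transform in-database lets it run in parallel across the cluster, which is the usual argument for ELT on large data sets.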
Analytics risks can keep you up at night. What if…
· We make a big investment and don’t break even?
· Management doesn’t trust the results?
· Analysts cross data privacy boundaries?
What a dilemma! You see the perils, yet you want the rewards that analytics can bring. The appropriate process enables you to dramatically reduce risks and maximize returns on your data and analytics investment.
In this presentation, you will learn:
· What causes most analytics failures
· How you can diminish risk and maximize returns through strong analytics process
· Why you (yes, you!) have a pivotal opportunity to establish high standards for analytics process right now
The recent ransomware outbreaks have destabilized business operations around the world.
The most recent ransomware scare came from what appeared to be a new variant of the Petya ransomware. Leveraging exploits and techniques similar to WannaCry's, along with other advanced techniques to sabotage systems, this latest attack clearly demonstrates how damaging malware can be, and likely will continue to be, to organizations.
How can you proactively prepare for such threats?
Watch this webinar to learn how to apply a broader analytics-driven approach to do the fundamentals better, and minimize the risk that your organization will be affected.
This session will include live demonstrations, and will cover best practices in the following areas:
• Security fundamentals – the importance of consistent blocking/tackling and security hygiene
• Posture assessment – establishing end-to-end visibility of potential ransomware activity
• Investigation, hunting and remediation – IR techniques to verify alerts and hypotheses, and prioritize based on risk
• Threat intelligence – identifying C2, file hashes and other ransomware IoCs
• Automation and orchestration – integrating a layered security architecture to drive to faster decisions
• Leveraging machine learning to detect ransomware patterns and adapt threat models for the latest mutations
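One of the simpler machine-assisted signals behind the last point: mass file encryption tends to produce high-entropy output, so a Shannon-entropy check over file bytes is a common (if coarse) heuristic for spotting possibly-encrypted files. A stdlib-only sketch, with a threshold that is illustrative rather than tuned; real detectors combine this with behavioral signals such as rename and extension-change bursts:

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Entropy in bits per byte: near 8.0 for encrypted or compressed
    data, much lower for plain text."""
    if not data:
        return 0.0
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def looks_encrypted(data: bytes, threshold: float = 7.5) -> bool:
    # Illustrative cutoff; compressed archives also score high, so this
    # signal alone produces false positives.
    return shannon_entropy(data) > threshold

text = b"The quick brown fox jumps over the lazy dog. " * 100
random_like = bytes(range(256)) * 20  # uniform byte distribution

print(shannon_entropy(text) < 5)      # plain text: low entropy
print(looks_encrypted(random_like))   # uniform bytes: entropy is exactly 8.0
```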
This case study is set in a multinational company with 300k+ employees, present in 100+ countries, that is adding an extra layer of security based on big data analytics capabilities in order to provide net-new value on top of its ongoing SOC-related investments.
With billions of events generated every week, real-time monitoring must be complemented with deep analysis to hunt targeted and advanced attacks.
By leveraging a cloud-based Spark cluster, ElasticSearch, R, Scala and PowerBI, a security analytics platform based on anomaly detection is being progressively implemented.
Anomalies are spotted by applying well-known analytics techniques, from data transformation and mining to clustering, graph analysis, topic modeling, classification and dimensionality reduction.
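At its simplest, the anomaly-detection idea described above reduces to flagging behaviour that deviates sharply from an entity's own baseline. The stdlib-only sketch below uses a z-score over per-user daily event counts; the field names and threshold are invented for illustration, and the case study's actual platform runs Spark, ElasticSearch, R and Scala at far larger scale:

```python
from statistics import mean, stdev

def anomalous_users(daily_logins: dict, z_threshold: float = 3.0) -> list:
    """Flag users whose latest daily login count deviates more than
    z_threshold standard deviations from their historical baseline."""
    flagged = []
    for user, counts in daily_logins.items():
        history, latest = counts[:-1], counts[-1]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(latest - mu) / sigma > z_threshold:
            flagged.append(user)
    return flagged

events = {
    "alice": [12, 10, 11, 13, 12, 11, 12],  # steady behaviour
    "bob":   [3, 4, 3, 5, 4, 3, 250],       # sudden burst: worth investigating
}
print(anomalous_users(events))  # ['bob']
```

The techniques listed above (clustering, graph analysis, topic modeling) generalize this same idea: define a model of "normal", then rank entities by their distance from it.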
Creating a seamless connected environment that supports smart community citizen services, streamlines operations, and supports economic development is already a challenge for community officials. There are many different needs, and many directions from which to begin the conversion to an “intelligent” environment. Communities are also planning and building not just for current needs, but also for future connectivity infrastructure that will be used by autonomous vehicles, smart buildings, connected homes, AR/VR, eRetail, eHealthcare, smartgrid and more.
This webcast will discuss such questions as:
> What issues are city officials prioritizing for resolution through smart community applications?
> How are communities planning for and deploying small cell infrastructure?
> Which departments are involved in communications infrastructure?
> How can suppliers navigate the multiple departments involved in decision making?
> What business models are cities negotiating with their technology product partners?
> What are some of the lessons learned from cities that you can translate into your own business offering?
Jascha Franklin-Hodge, CIO, Boston, MA
Kate Garman, Smart City Coordinator, Seattle, WA
Peter Marx, former CTO, City of Los Angeles, now VP at GE Digital
10 in Tech hosts Shane Carlson and Kirstie Magowan talked with Brian Uhelski from Accorio at Knowledge17. Brian talked about setting vision and strategy and how important this is to a successful ServiceNow implementation. Do one thing and do it really well. Brian stressed the importance of being involved in the community as this helps customers to make the right decisions.
With Yellowfin you can engage users with governed data they will actually use. Learn how you can deliver timely, engaging, and curated analytics with the highest employee usage in the industry.
The Yellowfin BI platform gives IT the control they need, analysts the tools and flexibility they need, and business decision-makers the trustworthy data they need. And it gives developers the flexibility and easy integration they desire.
ServiceNow’s Britt Champeau chatted with Kirstie Magowan and Shane Carlson from 10 in Tech about champion enablement. She helps organizations spread the value of ServiceNow across the enterprise. Britt concentrates on the ‘soft side’ of ServiceNow implementations. She emphasizes the value of organisational change management.
10 in Tech’s Kirstie Magowan and Shane Carlson chatted with Chris Pope of ServiceNow at Knowledge17. There were 15,000 attendees at Knowledge17, and Chris talked about how ServiceNow is sticking to its roots of understanding how people work and what they want and need to do.
Trusted customer data is the difference maker between happy customers and getting blasted on social media. By using our contact data verification tools, you can quickly clean your customer contact data so that it can be relied upon for success. Whether the end goal is improved customer relationships or another data-driven digital transformation objective, clean contact data enables you to engage with your customers more effectively.
During this webinar, you’ll hear from Salema Rice, Chief Data Officer of Allegis Group, on how great customer data has improved her customers' experience. You’ll discover the positive effects straight from a CDO and learn how you can get started creating better customer outcomes.
With the proliferation of analytics expanding across every function of the enterprise, the need for broader access to data, experienced data scientists and intuitive tools for non-technical users to produce reports and make discoveries is growing exponentially.
To meet this growing demand, organizations need to invest in tools, build processes, and enable users to be self-sufficient to avoid bottlenecks, while at the same time maintaining the governance and data security standards required to safeguard this critical corporate asset.
This webinar will illustrate how organizations are solving these challenges and enabling users to both access larger quantities of existing data as well as add new data to their own models without negatively impacting the quality, security or cost to store that data. It will also highlight some of the cost and performance benefits achieved by enabling self-service data management.
Join us for this next session of “Under the Hood of Vertica” as we take a deep dive into Vertica’s integration with Apache Spark, an in-memory data processing engine.
Our technical experts will discuss Vertica’s latest support for Apache Spark version 2.1, review common use cases for each, and demonstrate how to leverage Vertica and Spark together to enable continuous, real-time processing and transformation of data streams.
The volume of data streaming into the data center has been growing exponentially for decades. Bandwidth requirements are expected to continue growing 25 percent to 35 percent per year. At the same time, lower latency requirements continue to escalate. As a result, the design of services and applications—and how they are delivered—is rapidly evolving.
Instead of a single dedicated server, information requests coming into the data center are now fulfilled by multiple servers cooperating in parallel. The traditional three-tier network is quickly being replaced by spine-and-leaf networks. As a result, the physical infrastructure must be able to support higher link speeds and greater fiber density while enabling quick and easy migration to new, more demanding applications.
This webinar will address:
Solutions, support & decisions
40G or 25G lanes?
Preterminated or field-terminated cables?
Duplex or parallel transmission?
Singlemode, multimode or wideband multimode fiber?
Attendees will earn one BICSI Continuing Education Credit for attending.
9 months until the GDPR deadline - are you completely up-to-speed?
Our panel of data protection experts will be discussing the compliance considerations that you need to be assessing for May 2018 along with suggesting next steps from a cyber and general security standpoint.
We'll also be asking YOU what stage you're at in your preparations, via a series of interactive benchmarks throughout the session, to get a sense of where the security community stands.
GDPR and its May 2018 deadline are now firmly on the minds of the vast majority of security professionals, and with massive fines on the horizon for non-compliance, there has never been a better time to get to grips with the legislation and ensure that your organisation is secure and compliant.
It’s vital that your business has carried out the relevant preparations for compliance by then to make sure you don’t get whacked with a huge fine of up to €20m or 4% of your organisation’s global annual turnover.
Not only are there potentially huge financial repercussions, but leaving your business open to attack and your customers at risk can cause serious reputational damage.
Join the journey towards being data-focused and customer-centric using Big Data and Data Warehouse technologies.
In this webcast, you will understand what it means to take the journey to a data-focused approach and gain faster insight without infrastructure concerns.
Learn how to:
- Use Azure public cloud for big data
- Set up a SQL DW and a Hadoop cluster, and ask questions against large data sets
- Utilize Microsoft's best-in-class big data and analytics solutions, and learn how they can power your journey into adopting, analyzing and utilizing big data
TensorFlow is an open source software library for numerical computation and machine learning.
Join this session where Marwa will discuss:
-An introduction to artificial intelligence, machine learning and deep learning
-A sample of machine learning applications
-The TensorFlow story, its model, and Windows installation steps, with an object recognition demo
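To give a concrete flavour of the kind of numerical computation TensorFlow automates (it builds the computation graph and derives the gradients for you), here is a hand-rolled gradient-descent fit of a line y = w*x + b in plain Python. No TensorFlow is required; the data and hyperparameters are purely illustrative:

```python
# Fit y = w*x + b to data generated from y = 2x + 1 by minimising
# mean squared error, with the gradients derived by hand.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]  # exactly y = 2x + 1

w, b = 0.0, 0.0
lr = 0.02
n = len(xs)
for _ in range(5000):
    # Gradients of MSE: dL/dw = 2*mean((w*x+b-y)*x), dL/db = 2*mean(w*x+b-y)
    grad_w = 2 * sum((w * x + b - y) * x for x, y in zip(xs, ys)) / n
    grad_b = 2 * sum((w * x + b - y) for x, y in zip(xs, ys)) / n
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 3), round(b, 3))  # converges towards 2.0 and 1.0
```

In TensorFlow, the same loop is expressed by defining the loss as a graph of operations and letting automatic differentiation compute grad_w and grad_b, which is what makes the library practical for models with millions of parameters.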
Artificial Intelligence (AI) is not a technology for the future; it’s a huge business opportunity for today. But how can your organisation become a trailblazer for AI innovation, transforming the way you work to deliver immediate – and lasting – bottom line value?
Former CERN scientist, Prof. Dr. Michael Feindt, is one of the brightest minds in Machine Learning. Join him for a 30-minute masterclass in how to apply AI to your business.
You’ll learn how AI can:
•Make sense of market and customer complexity, to deliver quick and effective decisions every single day
•Increase workforce productivity to improve output and staff morale
•Enhance decision-making and forecasting accuracy, for operational efficiency and improved productivity
•Be implemented into your business quickly and easily, with minimal disruption
Michael will also share real-life examples of how international businesses are using AI as a transformation tool, from his experience as founder of market-leading AI solution provider, Blue Yonder.
Fraud detection is a classic adversarial analytics challenge: As soon as an automated system successfully learns to stop one scheme, fraudsters move on to attack another way. Each scheme requires looking for different signals (i.e. features) to catch; is relatively rare (one in millions for finance or e-commerce); and may take months to investigate a single case (in healthcare or tax, for example) – making quality training data scarce.
This talk will cover, through a code walk-through, the key lessons learned while building such real-world software systems over the past few years. We'll look for fraud signals in public email datasets, using IPython and popular open-source libraries (scikit-learn, statsmodels, nltk, etc.) for data science, and Apache Spark as the compute engine for scalable parallel processing.
David will iteratively build a machine-learned hybrid model – combining features from different data sources and algorithmic approaches, to catch diverse aspects of suspect behavior:
- Natural language processing: finding keywords in relevant context within unstructured text
- Statistical NLP: sentiment analysis via supervised machine learning
- Time series analysis: understanding daily/weekly cycles and changes in habitual behavior
- Graph analysis: finding actions outside the usual or expected network of people
- Heuristic rules: finding suspect actions based on past schemes or external datasets
- Topic modeling: highlighting use of keywords outside an expected context
- Anomaly detection: Fully unsupervised ranking of unusual behavior
Apache Spark is used to run these models at scale – in batch mode for model training and with Spark Streaming for production use. We’ll discuss the data model, computation, and feedback workflows, as well as some tools and libraries built on top of the open-source components to enable faster experimentation, optimization, and productization of the models.
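As a toy illustration of the hybrid-model idea, the sketch below blends two of the listed signal types, heuristic keyword rules and a simple behavioral anomaly score, into one suspicion score per message. The keywords, weights and thresholds are invented for illustration; the talk's production pipeline uses scikit-learn, nltk and Spark rather than anything this simple:

```python
from statistics import mean, stdev

SUSPECT_KEYWORDS = {"wire", "offshore", "urgent", "confidential"}  # illustrative

def keyword_score(text: str) -> float:
    """Heuristic-rule signal: fraction of suspect keywords present."""
    words = set(text.lower().split())
    return len(words & SUSPECT_KEYWORDS) / len(SUSPECT_KEYWORDS)

def volume_zscore(history: list, today: int) -> float:
    """Behavioral signal: how unusual is today's message volume?"""
    mu, sigma = mean(history), stdev(history)
    return 0.0 if sigma == 0 else abs(today - mu) / sigma

def suspicion(text: str, history: list, today: int,
              w_kw: float = 0.7, w_vol: float = 0.3) -> float:
    # Weighted blend of the two feature families; a real system would
    # learn these weights from labeled investigation outcomes.
    return w_kw * keyword_score(text) + w_vol * min(volume_zscore(history, today) / 5, 1.0)

normal = suspicion("lunch on friday?", [20, 22, 19, 21, 20], 21)
shady = suspicion("urgent confidential wire to offshore account", [20, 22, 19, 21, 20], 95)
print(normal < shady)  # the suspect message scores higher
```

The point of the hybrid design is exactly this: no single signal is reliable on its own, but combining independent feature families makes each fraud scheme harder to evade.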
With old-school device fingerprinting, it’s easy to stop or allow known devices, but with the explosion in the number of new devices, and companies engaging customers across multiple channels, the technology falls short of helping you identify transactions that are truly risky or good.
In this webinar, you’ll see how device intelligence with machine learning allows you to derive more accurate fraud and risk insights from large amounts of device engagement data.
Artificial intelligence has greatly changed the way we live since the 20th century. It involves the science and engineering of making machines intelligent and autonomous using computer programs.
The processing power of computers has increased exponentially while the cost of processors and storage has fallen. This has made research and development efforts in AI areas such as deep learning, once thought impossible, possible.
In this webinar, we will examine current methods, application domains of specific methods, their impacts on our daily lives and try to answer questions on ethics of applying these technologies.
If you are working on an AI project, drop it. Chances are good that you don’t have AI problems (yet) but instead have data challenges. In this webinar, we will share what we learned from running a large data platform ingesting 2 million data points/sec and running 7 million queries for hundreds of enterprises. We will focus on the problems that you can tackle on Monday so that you are best positioned for next year and beyond.
About the presenter:
Kiyoto is VP of Marketing at Treasure Data. Previously, he was a Software Engineer at TrialPay and Trader at DRW Trading. He is a self-proclaimed hacker at heart: “creativity and do-ocracy over passivity and bureaucracy any day.”
Intelligent application technology allows you to build apps with powerful algorithms, across platforms, with just a few lines of code. How? The answer lies in your application's architecture.
Join Pradeep Menon, Microsoft Data Solution Architect, and Karthik Rajasekharan, Microsoft Azure Product Marketing Director, to learn:
- The different reference architectures and how to integrate these services into your applications immediately.
- A hands-on approach to implementing Cognitive Services on Azure
- About the next generation of application development and deployment
Often called the ‘Fourth Industrial Revolution’, digitization promises to radically transform manufacturing and supply chains. However, many companies are struggling to move from theory to practice. This webinar (part 5 of our series "The Hitchhikers Guide to IoT") introduces a pragmatic approach developed by Hitachi, which delivers tangible results through the application of IoT, advanced analytics, and artificial intelligence in industrial operations. An example project which Hitachi implemented with a global automotive supplier illustrates the approach, outcomes, and lessons learned. Key points covered in this webinar:
• The opportunity and challenges in manufacturing
• A pragmatic roadmap for digital transformation
• Example: transforming manufacturing at Daicel
• Lessons learned and recommendations
Greg Kinsey is Vice President of Digital Industrial Solutions at Hitachi, leading the global strategy and the EMEA solutions business. He has over 30 years of international experience in manufacturing, technology, and consulting, and is recognized as a leading expert in the field of digital transformation.
When analysis needs to be used by decision makers who didn’t create it, the communication of the information and the message it conveys becomes critical. There is a plethora of ways to lay out reports and dashboards, even within a single company.
Enter the SUCCESS formula, that “lightbulb” moment.
Introduced by the IBCS Association (International Business Communication Standards), the SUCCESS formula provides conceptual, perceptual and semantic rules that enable faster, better, and less costly results in all stages of business communications and decision-making processes.
This webinar will introduce the 7 Rules of SUCCESS, a toolkit to aid analysts in designing their visualisations for better reach and better decisions among their target audience.
The webinar will also introduce the Philips journey to implementing IBCS principles in their global “Accelerate!” initiative.
Looking to take your graphs to the next level? Want to make sure you choose the right visualization? Plagued by the challenges of geospatial heat maps?
Get your questions ready and join this session where data experts Carl and Brett will go over the common questions they get asked and tackle the data visualization issues you've run into, including how to:
-Use location-based data to put your visualization on the map
-Uncover new relationships, patterns and opportunities
-Identify emerging trends
-Answer comparative business questions with set analysis
-Understand best practices for creating an aesthetically-pleasing and useful visualization
Predictive Analytics - everyone is talking about it and many organisations claim to be doing it. But are they? And what insights do they gain to then make tactical or strategic changes? How can analysts work with decision makers by sharing results in a visually effective and meaningful way while also informing them about possible courses of action?
This webinar is presented by Andy Kriebel, Head Coach at the Data School and Eva Murray, Tableau Evangelist at Exasol. Our guest speaker on Predictive Analytics is Benedetta Tagliaferri, Consulting Analyst at The Information Lab.
The webinar will look at some examples of predictive analysis and will show data visualization examples that are actionable and can drive further questions and discussions in an organisation.
There has been a flood of publicity around big data, data processing, and the role of predictive analytics in businesses of the future.
As business operators, how do we get access to these valuable business insights, even when there isn't a data analyst around to walk us through the results?
- Should your software emulate a data scientist?
- Learn about the power of data visualizations.
- Learn about creating value from disparate data sets.