Welcome to the big data and data management community on BrightTALK. Join thousands of data quality engineers, data scientists, database administrators and other professionals to find more information about the hottest topics affecting your data. Subscribe now to learn about efficiently storing, optimizing a complex infrastructure, developing governing policies, ensuring data quality and analyzing data to make better informed decisions. Join the conversation by watching live and on-demand webinars and take the opportunity to interact with top experts and thought leaders in the field.
10 in Tech’s Kirstie Magowan and Shane Carlson chatted with Chris Pope of ServiceNow at Knowledge17. There were 15,000 attendees at Knowledge17, and Chris talked about how ServiceNow is sticking to its roots of understanding how people work and what they want and need to do.
Trusted customer data is the difference maker between happy customers and getting blasted on social media. By using our contact data verification tools, you can quickly clean your customer contact data so that it can be relied upon for success. Whether the end goal is improved customer relationships or another data-driven digital transformation objective, clean contact data enables you to engage with your customers more effectively.
During this webinar, you’ll hear from Salema Rice, Chief Data Officer of Allegis Group, on how great customer data has improved her customers' experience. You’ll hear about the positive effects directly from a CDO and learn how you can get started in creating better customer outcomes.
Join us for this next session of “Under the Hood of Vertica” as we take a deep dive into Vertica’s integration with Apache Spark, an in-memory data processing engine.
Our technical experts will discuss Vertica’s latest support for Apache Spark version 2.1, review common use cases for each, and demonstrate how to leverage Vertica and Spark together to enable continuous, real-time processing and transformation of data streams.
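The connector specifics covered in the session aren't reproduced here, but one common pattern for pairing the two engines is reading a Vertica table into a Spark DataFrame over JDBC. The host, database, table, and credentials below are placeholders, and the `spark.read` call is left commented out since it needs a live SparkSession plus the Vertica JDBC driver on the classpath:

```python
# Sketch: reading a Vertica table into Spark over JDBC.
# All connection details are hypothetical placeholders.
jdbc_url = "jdbc:vertica://vertica-host:5433/analytics_db"

read_options = {
    "url": jdbc_url,
    "dbtable": "public.click_stream",   # hypothetical source table
    "user": "dbadmin",
    "password": "secret",
    "driver": "com.vertica.jdbc.Driver",
}

# With a live SparkSession and the Vertica JDBC jar available:
# df = spark.read.format("jdbc").options(**read_options).load()
# df.groupBy("page_id").count().show()

print(read_options["url"])
```

The same options dictionary can be reused for writes, which is what makes round-tripping results between Spark transformations and Vertica storage straightforward.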
With the proliferation of analytics expanding across every function of the enterprise, the need for broader access to data, experienced data scientists and intuitive tools for non-technical users to produce reports and make discoveries is growing exponentially.
To meet this growing demand, organizations need to invest in the tools, build processes and enable users to be self-sufficient to avoid bottlenecks while at the same time maintaining the governance and data security standards required to safeguard this critical corporate asset.
This webinar will illustrate how organizations are solving these challenges and enabling users to both access larger quantities of existing data as well as add new data to their own models without negatively impacting the quality, security or cost to store that data. It will also highlight some of the cost and performance benefits achieved by enabling self-service data management.
The volume of data streaming into the data center has been growing exponentially for decades. Bandwidth requirements are expected to continue growing 25 percent to 35 percent per year. At the same time, lower latency requirements continue to escalate. As a result, the design of services and applications—and how they are delivered—is rapidly evolving.
Instead of a single dedicated server, information requests coming into the data center are now fulfilled by multiple servers cooperating in parallel. The traditional three-tier network is quickly being replaced by spine-and-leaf networks. As a result, the physical infrastructure must be able to support higher link speeds and greater fiber density while enabling quick and easy migration to new, more demanding applications.
This webinar will address:
- Solutions, support & decisions
- 40G or 25G lanes?
- Preterminated vs. field-terminated cables
- Duplex or parallel transmission
- Singlemode, multimode or wideband multimode fiber?
Attendees will earn one BICSI Continuing Education Credit for attending.
9 months until the GDPR deadline - are you completely up-to-speed?
Our panel of data protection experts will be discussing the compliance considerations that you need to be assessing for May 2018 along with suggesting next steps from a cyber and general security standpoint.
We'll also be asking YOU what stage you're at in your preparations via a series of interactive benchmarks as we go through the session, to get a sense of where the security community stands.
GDPR and its May 2018 deadline are now firmly on the minds of the vast majority of security professionals, and with massive fines on the horizon for non-compliance, there is no better time than now to get to grips with the legislation and ensure that your organisation is secure and compliant.
It’s vital that your business has carried out the relevant preparations for compliance by then to make sure you don’t get hit with a huge fine of up to €20m or 4% of your organisation’s global annual turnover, whichever is higher.
Not only are there potentially huge financial repercussions, but leaving your business open to attack and your customers at risk can cause serious reputational damage.
Join the journey towards being data-focused and customer-centric using Big Data and Data Warehouse technologies.
In this webcast, you will understand what it means to take the journey to a data-focused approach and get faster insight without infrastructure concerns.
Learn how to:
- Use Azure public cloud for big data
- Set up a SQL DW and a Hadoop cluster, and ask questions against large data sets
- Utilize Microsoft's best-in-class big data and analytics solutions, and see how they can power your journey into adopting, analyzing and utilizing big data
Overcome Performance Challenges in Building Spark Applications for AWS is a webinar presentation intended for software engineers, developers, and technical leads who develop Spark applications for EMR or EC2 clusters.
In this webinar, Vinod Nair will show you how to:
Identify which portion of your application consumes the most resources
Identify the bottlenecks slowing down your applications
Test your applications against development or production workloads
Significantly reduce troubleshooting issues due to ambient cluster conditions
This webinar is followed by a live Q & A. A replay of this webinar will be available within 24 hours at https://www.pepperdata.com/resources/webinars/.
As storage technologies continue to evolve in 2017, so do the challenges faced in integrating conventional storage along with new age scalable object storage. Caringo Product Manager Glen Olsen will show how contemporary storage solutions are enhancing traditional storage and protocols so they benefit from modern cost-efficient scalability and data protection.
The problem with data centers is that they generate a lot of events – from low-level disk warnings to critical network issues to service-level failures. While you can create manual rules to filter out some of the noise, this often isn’t enough. How do you navigate this flood of events and prioritize the ones that truly matter? How do you relate these critical events to the same core issue, and ensure everyone has the same understanding of the impacted service, relative priority and likely root cause?
Watch this webinar to learn how you can use Splunk IT Service Intelligence and machine learning to:
- Monitor real-time data and be alerted when an anomaly occurs
- Automatically correlate data to generate highly qualified information, so you can take fast action
- Enable users to investigate with context, priority and importance, so they are empowered to take actions faster
This webinar will focus on the cultural shift from tightly controlled business networks of yesterday to the converged fabric adopted by businesses today. BYOD is becoming a normality for most organisations and it doesn't have to be a heavy burden for security teams with the right policies, people and technology in place. We'll dive into some of the options available for these challenges in this webinar and how having the right BYOD strategy can play an integral role in an organisation's preparation for EU GDPR compliance.
- The security options available today to enable an efficient and safe BYOD strategy
- How implementing a strong BYOD strategy can help compliance
- How you can reduce the risk of suffering a damaging cyber-breach
The term “Software-Defined" sounds like marketing hype but this technology is behind much of current IT systems’ advanced features and functionality - really. And, it didn’t start with software-defined storage. In this webinar Evaluator Group Sr. Analyst, Eric Slack, explains where software-defined came from, how it’s shaping IT infrastructure and what you can expect from this technology.
Most organizations continue to struggle when balancing tradeoffs between enabling IT services -- at low cost and low overhead -- versus staying ahead of modern threats.
Children’s Discovery Museum of San Jose (CDM) needed to deliver quality educational experiences, while protecting their messaging platform from spear phishing attempts and emerging ransomware mutations -- in a highly scrutinized, budget-restricted environment.
Watch this webinar to learn how CDM is using Splunk software to realize the benefits of a sound data strategy resulting in faster, better analytics-driven decisions. CDM has been able to address security, manageability and a variety of operational IT challenges.
In this session, learn how CDM prepares for persistent and emerging malware threats, such as ransomware, via an adaptive approach, and by applying techniques such as:
- Identifying, classifying and automatically blocking statistically-identified threats
- Using primary network data as a key data source within their messaging system
- Leveraging active data pipelines and autonomous, active counter-measures
- Testing for relevance to stay ahead of evolving attack methods
Today, the majority of big data and analytics use cases are built on hybrid cloud infrastructure. A hybrid cloud is a combination of on-premises and local cloud resources integrated with one or more dedicated cloud(s) and one or more public cloud(s). Hybrid cloud computing has matured to support data security and privacy requirements as well as increased scalability and computational power needed for big data and analytics solutions.
This webinar summarizes what hybrid cloud is, explains why it is important in the context of big data and analytics, and discusses implementation considerations unique to hybrid cloud computing.
Network threats and data breaches continue to grow in number, sophistication and speed, overwhelming current defensive capabilities. Security teams, limited in staff, resources and time, suffer from diminished effectiveness and enterprise protection. To stay ahead, organizations must create an adaptive ecosystem of network defenses; much like the body leverages its immune system. A Defense Lifecycle Model speeds threat identification and mitigation by incorporating machine learning and artificial intelligence into these security processes. Join Gigamon and (ISC)2 on August 10, 2017 at 1:00PM Eastern for a discussion on automated prevention, detection, prediction and containment and how it can help to fortify your defense.
Data science and machine learning tools traditionally focus on training models. When companies begin to employ machine learning in actual production workflows, they encounter new sources of friction such as sharing models across teams, deploying identical models on different systems, and maintaining featurization logic. In this webinar, we discuss how Databricks provides a smooth path for productionizing Apache Spark MLlib models and featurization pipelines.
Databricks Model Scoring provides a simple API for exporting MLlib models and pipelines. These exported models can be deployed in many production settings, including:
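The Databricks Model Scoring API itself isn't reproduced here. The general pattern it enables, though — train a model in one environment, serialize it, and score with it in another — can be sketched with a toy linear model and stdlib JSON (all names and numbers below are illustrative, not the actual export format):

```python
import json

# --- Training environment: fit a (toy) model and export it ---
model = {"weights": [0.4, -1.2], "intercept": 0.3}
exported = json.dumps(model)        # stand-in for a real export format

# --- Serving environment: reload the model and score new records ---
loaded = json.loads(exported)

def score(features):
    """Linear model: dot product of features and weights, plus intercept."""
    z = sum(w * x for w, x in zip(loaded["weights"], features))
    return z + loaded["intercept"]

print(score([1.0, 2.0]))
```

The value of a dedicated export API is that the featurization pipeline travels with the model, so the serving side does not have to re-implement the training-time feature logic.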
Will 5G herald a complete shake up of the broadcast and media industry?
Watch now on demand to explore the implications and potential of 5G from a user and supplier perspective and gain technical and business takeaways for your 5G strategy.
Darko Ratkaj, Senior Project Manager for Technology & Innovation, EBU
Phill Lawson-Shanks, Chief Architect & Vice President of Innovation, EdgeConneX
Gianluca Noya, Managing Director Network Mobile Services, Accenture
At Tableau we help people see and understand data. Seven words that drive everything we do. And they’ve never been more relevant. Tableau is all about making your analytics faster, smarter, and more powerful, so that everyone can get the answers they need. Helping people gain insight into their data to solve unexpected problems is what drives us.
Tableau is a visual analytics and reporting solution that connects directly to R, Python, and more. It’s designed for you, the domain expert who understands the data. Its drag-and-drop interface allows you to effortlessly connect to libraries and packages, import saved models, or write new ones directly into calculations, visualizing them in seconds.
In this webinar, we will explore how various analytics partners are leveraged in Tableau, and how to take advantage of these integrations to move your analysis to the next level. Whether you work with R, Python, or other statistical or data mining environments, Tableau allows you to take advantage of your existing investments and knowledge to compose impactful data stories.
Creating a seamless connected environment that supports smart community citizen services, streamlines operations, and fosters economic development is already a challenge for community officials. There are many different needs and directions to begin the conversion to an “intelligent” environment. Communities are also planning and building not just for current needs, but also for future connectivity infrastructure that will be used by autonomous vehicles, smart buildings, connected homes, AR/VR, eRetail, eHealthcare, smartgrid and more.
This webcast will discuss such questions as:
> What issues are city officials prioritizing for resolution through smart community applications?
> How are communities planning for and deploying small cell infrastructure?
> Which departments are involved in communications infrastructure?
> How can suppliers navigate the multiple departments involved in decision making?
> What business models are cities negotiating with their technology product partners?
> What are some of the lessons learned from cities that you can translate into your own business offering?
Jascha Franklin-Hodge, CIO, Boston, MA
Kate Garman, Smart City Coordinator, Seattle, WA
Peter Marx, currently in the position of VP, GE Digital and former CTO, City of Los Angeles
This case study is framed in a multinational company with 300k+ employees, present in 100+ countries, that is adding one extra layer of security based on big data analytics capabilities, in order to provide net-new value to their ongoing SOC-related investments.
With billions of events generated on a weekly basis, real-time monitoring must be complemented with deep analysis to hunt targeted and advanced attacks.
By leveraging a cloud-based Spark cluster, ElasticSearch, R, Scala and PowerBI, a security analytics platform based on anomaly detection is being progressively implemented.
Anomalies are spotted by applying well-known analytics techniques, from data transformation and mining to clustering, graph analysis, topic modeling, classification and dimensionality reduction.
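As one concrete illustration of the anomaly-detection side (a generic sketch, not the platform's actual implementation), a z-score over per-host event counts flags machines whose activity deviates sharply from the baseline:

```python
import statistics

# Hypothetical weekly event counts per host; the last one is a spike.
event_counts = {
    "host-01": 102, "host-02": 98, "host-03": 110,
    "host-04": 95,  "host-05": 104, "host-06": 940,
}

counts = list(event_counts.values())
mean = statistics.mean(counts)
stdev = statistics.stdev(counts)

# Flag hosts more than 2 standard deviations above the mean.
anomalies = [h for h, c in event_counts.items() if (c - mean) / stdev > 2.0]
print(anomalies)  # → ['host-06']
```

In the case study, the same idea is applied at far greater scale on the Spark cluster, and the statistical baseline is replaced by the clustering, graph and topic-modeling techniques listed above.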
Analytics risks can keep you up at night. What if…
· We make a big investment and don’t break even?
· Management doesn’t trust the results?
· Analysts cross data privacy boundaries?
What a dilemma! You see the perils, yet you want the rewards that analytics can bring. The appropriate process enables you to dramatically reduce risks and maximize returns on your data and analytics investment.
In this presentation, you will learn:
· What causes most analytics failures
· How you can diminish risk and maximize returns through strong analytics process
· Why you (yes, you!) have a pivotal opportunity to establish high standards for analytics process right now
The recent ransomware outbreaks have destabilized business operations around the world.
The most recent ransomware scare came from what appeared to be a new variant of the Petya ransomware. It leveraged exploits and techniques similar to WannaCry, along with other advanced techniques, to cause damage by sabotaging systems. This latest attack clearly demonstrates how damaging malware can be, and likely will continue to be, for organizations.
How can you proactively prepare for such threats?
Watch this webinar to learn how to apply a broader analytics-driven approach to do the fundamentals better, and minimize the risk that your organization will be affected.
This session will include live demonstrations, and will cover best practices in the following areas:
• Security fundamentals – the importance of consistent blocking/tackling and security hygiene
• Posture assessment – establishing end-to-end visibility of potential ransomware activity
• Investigation, hunting and remediation – IR techniques to verify alerts and hypotheses, and prioritize based on risk
• Threat intelligence – identifying C2, file hashes and other ransomware IoCs
• Automation and orchestration – integrating a layered security architecture to drive to faster decisions
• Leveraging machine learning to detect ransomware patterns and adapt threat models for the latest mutations
In this technical session, we will learn about the different use cases around Amazon Redshift and discuss how Informatica Intelligent Cloud Services is used to implement these use cases. Through real world examples, you’ll learn about best practices, when to use ETL versus ELT, performance tuning, and new features recently introduced.
Key webinar learnings:
• Amazon Redshift: Key benefits, capabilities, and use case patterns
• Informatica Intelligent Cloud Services for Amazon Redshift and other Amazon services
• Best Practices
• Deep Dive Demo of key use cases
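On the ETL-versus-ELT question: ETL transforms data before loading it, while ELT loads raw data first and pushes the transformation into the warehouse engine, which is where an MPP warehouse like Redshift pays off. A stdlib sketch of the ELT pattern, using SQLite purely as a stand-in warehouse (table and column names are illustrative):

```python
import sqlite3

# Stand-in "warehouse" (SQLite here; Redshift in the webinar's use cases).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE raw_orders (customer TEXT, amount REAL)")

# E + L: extract and load the raw records as-is, with no transformation.
raw = [("alice", 120.0), ("bob", 80.0), ("alice", 50.0)]
db.executemany("INSERT INTO raw_orders VALUES (?, ?)", raw)

# T: transform inside the warehouse with SQL, after loading.
db.execute("""
    CREATE TABLE customer_totals AS
    SELECT customer, SUM(amount) AS total
    FROM raw_orders
    GROUP BY customer
""")

totals = dict(db.execute("SELECT customer, total FROM customer_totals"))
print(totals)
```

The trade-off discussed in sessions like this one is that ELT keeps the load path simple and lets the warehouse parallelize the heavy transformation, at the cost of storing raw data in the warehouse first.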
Performance is often a key factor in choosing big data platforms. Over the past few years, Apache Spark has seen rapid adoption by enterprises, making it the de facto data processing engine for its performance and ease of use.
Since starting the Spark project, our team at Databricks has been focusing on accelerating innovation by building the most performant and optimized Unified Analytics Platform for the cloud. Join Reynold Xin, Co-founder and Chief Architect of Databricks as he discusses the results of our benchmark (using TPC-DS industry standard requirements) comparing the Databricks Runtime (which includes Apache Spark and our DBIO accelerator module) with vanilla open source Spark in the cloud and how these performance gains can have a meaningful impact on your TCO for managing Spark.
This webinar covers:
Differences between open source Spark and Databricks Runtime.
Details on the benchmark including hardware configuration, dataset, etc.
Summary of the benchmark results, which reveal performance gains of up to 5x over open source Spark and other big data engines.
A live demo comparing processing speeds of Databricks Runtime vs. open source Spark.
Special Announcement: We will also announce an experimental feature as part of the webinar that aims at drastically speeding up your workloads even more. Be the first to see this feature in action. Register today!
Today, businesses of varying sizes, products and services, and industries are actively applying Predictive Business Analytics, seeking greater customer engagement and, as a result, increased revenues and profits. Often these businesses are looking to their finance group to guide, if not lead, these efforts to successful outcomes.
In a recent article, Ray Tong cited a CFO magazine survey in which:
• Over 70% of respondents said that they plan to substantially increase the use of data analytics to support decision making and improve business partnering;
• 68% of respondents, the majority of whom were finance executives, said that they plan to improve their data analytics skills in the coming year.
Consequently, it is essential that the financial professional understands what it is going to take for them to succeed.
This webinar discusses the five keys to applying Predictive Business Analytics in your organization. It provides attendees with specific and practical insights to guide their approaches to develop and deploy an effective process that improves managerial decision making across many core performance and financial areas.
What you will learn:
• What are the key steps to success with practical examples
• What steps to avoid
• How to ensure and gain organizational impact
Larry Maisel is a recognized author and thought leader in Business Analytics, with experience assisting companies to improve operating performance and business results. An accomplished business professional with more than 20 years of experience, he has proven leadership skills and expertise in driving bottom-line profits and implementing business systems.
Migrating your infrastructure from on-premises to the cloud requires extensive monitoring to ensure your IT operations don't suffer along the way. For many organizations, garnering this insight can be difficult -- if not impossible. Join Blue Medora's Mike Langdon to learn key best practices to monitor your infrastructure during migration, helping you simplify the process and ensure optimal performance from migration to decommissioning.
TensorFlow is an open source software library for numerical computation and machine learning.
Join this session where Marwa will discuss:
- Introduction to artificial intelligence, machine learning and deep learning
- A sample of machine learning applications
- The TensorFlow story, model, and Windows installation steps, with an object recognition demo
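TensorFlow itself isn't required to see its core idea: computations are expressed as a graph of operations that is built first and evaluated later. The toy sketch below (plain Python, not TensorFlow, and a deliberate simplification) illustrates that "define the graph, then run it" model:

```python
# Toy computation graph: each node is a tuple of (op, inputs);
# evaluation walks the graph recursively, mimicking TensorFlow in miniature.

def const(value):
    return ("const", value)

def add(a, b):
    return ("add", a, b)

def mul(a, b):
    return ("mul", a, b)

def evaluate(node):
    op = node[0]
    if op == "const":
        return node[1]
    left, right = evaluate(node[1]), evaluate(node[2])
    return left + right if op == "add" else left * right

# y = (3 * 4) + 2, built as a graph and only computed at evaluate() time.
graph = add(mul(const(3), const(4)), const(2))
print(evaluate(graph))  # → 14
```

Representing the computation as a graph is what lets a framework like TensorFlow optimize it, differentiate through it, and dispatch it to CPUs or GPUs.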
Artificial Intelligence (AI) is not a technology for the future; it’s a huge business opportunity for today. But how can your organisation become a trailblazer for AI innovation, transforming the way you work to deliver immediate – and lasting – bottom line value?
Former CERN scientist, Prof. Dr. Michael Feindt, is one of the brightest minds in Machine Learning. Join him for a 30-minute masterclass in how to apply AI to your business.
You’ll learn how AI can:
•Make sense of market and customer complexity, to deliver quick and effective decisions every single day
•Increase workforce productivity to improve output and staff morale
•Enhance decision-making and forecasting accuracy, for operational efficiency and improved productivity
•Be implemented into your business quickly, easily, with minimal disruption
Michael will also share real-life examples of how international businesses are using AI as a transformation tool, from his experience as founder of market-leading AI solution provider, Blue Yonder.
Fraud detection is a classic adversarial analytics challenge: As soon as an automated system successfully learns to stop one scheme, fraudsters move on to attack another way. Each scheme requires looking for different signals (i.e. features) to catch; is relatively rare (one in millions for finance or e-commerce); and may take months to investigate a single case (in healthcare or tax, for example) – making quality training data scarce.
This talk will cover, through a code walk-through, the key lessons learned while building such real-world software systems over the past few years. We'll look for fraud signals in public email datasets, using IPython and popular open-source libraries (scikit-learn, statsmodel, nltk, etc.) for data science and Apache Spark as the compute engine for scalable parallel processing.
David will iteratively build a machine-learned hybrid model – combining features from different data sources and algorithmic approaches, to catch diverse aspects of suspect behavior:
- Natural language processing: finding keywords in relevant context within unstructured text
- Statistical NLP: sentiment analysis via supervised machine learning
- Time series analysis: understanding daily/weekly cycles and changes in habitual behavior
- Graph analysis: finding actions outside the usual or expected network of people
- Heuristic rules: finding suspect actions based on past schemes or external datasets
- Topic modeling: highlighting use of keywords outside an expected context
- Anomaly detection: Fully unsupervised ranking of unusual behavior
Apache Spark is used to run these models at scale – in batch mode for model training and with Spark Streaming for production use. We’ll discuss the data model, computation, and feedback workflows, as well as some tools and libraries built on top of the open-source components to enable faster experimentation, optimization, and productization of the models.
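The speaker's actual code isn't reproduced here, but the "hybrid" idea above can be sketched with the standard library alone: combine a heuristic keyword rule with a simple behavioral anomaly score into one suspicion score. All keywords, thresholds and weights below are illustrative:

```python
import statistics

SUSPECT_KEYWORDS = {"wire", "offshore", "invoice"}  # illustrative rule list

def keyword_score(text):
    """Heuristic rule: fraction of suspect keywords present in the message."""
    words = set(text.lower().split())
    return len(words & SUSPECT_KEYWORDS) / len(SUSPECT_KEYWORDS)

def volume_score(daily_counts, today):
    """Anomaly signal: how far today's send volume sits above the usual mean."""
    mean = statistics.mean(daily_counts)
    stdev = statistics.stdev(daily_counts) or 1.0
    return max(0.0, (today - mean) / stdev)

def hybrid_score(text, daily_counts, today, w_kw=0.6, w_vol=0.4):
    """Weighted blend of two independent signals into one suspicion score."""
    return w_kw * keyword_score(text) + w_vol * volume_score(daily_counts, today)

history = [12, 15, 11, 14, 13]          # hypothetical emails sent per day
msg = "please wire the invoice to the offshore account"
print(round(hybrid_score(msg, history, today=60), 2))
```

Each signal in the talk's list (NLP, time series, graph, topic models, anomaly detection) contributes one such feature; the production version blends many of them and runs on Spark rather than a single machine.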
With old-school device fingerprinting, it’s easy to stop or allow known devices, but with the explosion in the number of new devices and companies engaging customers across multiple channels, the technology falls short in helping you identify transactions that are truly risky or good.
In this webinar, you’ll see how device intelligence with machine learning allows you to derive more accurate fraud and risk insights from large amounts of device engagement data.
Artificial intelligence has greatly changed the way we live since the 20th century. It involves the science and engineering of making machines intelligent and autonomous using computer programs.
The processing power of computers has been increasing exponentially while the cost of processors and storage has been decreasing. This has made research and development efforts in AI areas such as deep learning, once thought to be impossible, possible.
In this webinar, we will examine current methods, application domains of specific methods, their impacts on our daily lives and try to answer questions on ethics of applying these technologies.