Video interview: Big Data Challenges: What to do and where to go?
Listen to our interview at Big Data LDN with Jason Foster, Director & Founder at Cynozure.
Jason will discuss:
- The value of Big Data and which skills are required to deliver that value
- How to get started with Big Data projects
- What to do if progress is limited
- Business opportunities around customer insight, supply chain analytics, and more
Recorded: Nov 29, 2016 | 6 mins
Cory Minton, Pratik Verma, William McKnight and Rob Antczak
This webinar is part of BrightTALK's What's Big in BI series.
The data revolution has resulted in a seemingly endless variety of tools and techniques designed to extract insights from big data and gain a decisive competitive edge in the marketplace. Many successful organizations across major industries like Financial Services, Healthcare and Retail now analyze the data generated by everyday business activities and use the resulting insights to inform critical business decisions.
Tune in to learn more about how industry leaders are becoming Modern BI powerhouses and get expert panelist advice on how to accelerate your own business intelligence and analytics strategy in 2019 and beyond.
Moderator:
Cory Minton, Founder & CEO of Big Data Beard
Panelists:
Pratik Verma, CEO of BlueTalon
William McKnight, President of McKnight Consulting Group
Rob Antczak, CTO of PowerAdvocate
Richard Corderoy, Oakland Data & Analytics | Andy Mott, Arcadia Data
With just a few weeks to go until the UK's largest data & analytics event, we've gathered some of the elite speakers who will be taking the stage to debate the latest trends, hottest solutions and the biggest opportunities (and challenges) for businesses in a data-driven world.
* Fast Data & DataOps
* Self-Service Analytics
* Artificial Intelligence
* Customer Experience
* Data Governance
What will they be talking about at Olympia London on 13-14 November 2018? What do they want to hear about, and what are they looking forward to?
Join this panel discussion and arm yourself for excellence in this brave new data-driven world.
Richard Corderoy, Chief Data Officer, Oakland Data and Analytics
Andy Mott, Senior Consultant, Arcadia Data
Greg McSorley, Amphenol; Rick Kutcipal, Broadcom; Kevin Marks, Dell; Jeremiah Tussey, Microsemi
The recent data explosion is a huge challenge for storage and IT system designers. How do you crunch all that data at a reasonable cost? Fortunately, your familiar SAS comes to the rescue with its new 24G speed. Its flexible connection scheme already allows designers to scale huge external storage systems with low latency. Now the new high operating speed offers the throughput you need to bring big data to its knobby knees! Our panel of storage experts will present practical solutions to today’s petabyte problems and beyond.
For years, banks have been sitting on a goldmine of customer data. Only recently have they started exploiting it, and, not surprisingly, mostly for their own benefit.
Personal data can give great insights that drive bank outcomes by reducing credit and fraud losses. In this webinar, Paul Clark, CTO, will look at how we can use customer data to:
* Drive the customer’s own advantage
* Avoid slip-ups
* Dodge nasty charges
* Optimise the customer’s finances end to end.
Kirk Borne, Principal Data Scientist, Booz Allen Hamilton & Andreas Blumauer, CEO, Managing Partner Semantic Web Company
Implementing AI applications based on machine learning is a significant topic for organizations embracing digital transformation. By 2020, 30% of CIOs will include AI in their top five investment priorities according to Gartner’s Top 10 Strategic Technology Trends for 2018: Intelligent Apps and Analytics. But to deliver on the AI promise, organizations need to generate good quality data to train the algorithms. Failure to do so will result in the following scenario: "When you automate a mess, you get an automated mess."
This webinar covers:
- An introduction to machine learning use cases and challenges provided by Kirk Borne, Principal Data Scientist at Booz Allen Hamilton and top data science and big data influencer.
- How to achieve good data quality based on harmonized semantic metadata presented by Andreas Blumauer, CEO and co-founder of Semantic Web Company and a pioneer in the application of semantic web standards for enterprise data integration.
- How to apply a combined approach when semantic knowledge models and machine learning build the basis of your cognitive computing. (See Attachment: The Knowledge Graph as the Default Data Model for Machine Learning)
- Why a combination of machine and human computation approaches is required, not only from an ethical but also from a technical perspective.
Erik McBain, Strategic Account Manager, MindBridge Ai
How are financial service firms around the world using machine learning systems today to identify and address risk in transactional datasets?
This webinar will look at a new approach to transaction analysis and illustrate how the combination of traditional rules-based approaches can be augmented with next-generation machine learning systems to uncover more in the data, faster and more efficiently.
We will span the various applications in banking, payments, trading, and compliance, looking at a variety of use cases from bank branch transaction analysis to trading data validation.
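As a purely illustrative sketch (not MindBridge Ai's actual pipeline), the hybrid idea can be expressed in a few lines of Python: hand-written rules catch known-bad patterns, while an unsupervised model such as scikit-learn's IsolationForest scores everything for statistical unusualness. The field layout, thresholds, and model choice are all assumptions.

```python
# Illustrative only: combine simple rules with an unsupervised model to score
# transactions. The field layout, thresholds, and choice of IsolationForest
# are assumptions for this sketch, not the vendor's actual pipeline.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Synthetic transactions: [amount, hour_of_day, days_since_last_txn]
transactions = rng.normal(loc=[120.0, 13.0, 2.0], scale=[80.0, 4.0, 1.5], size=(1000, 3))

# 1) Traditional rules: flag obviously suspicious patterns.
rule_flags = (transactions[:, 0] > 5000) | (transactions[:, 1] < 4)

# 2) Machine learning: score every transaction for statistical unusualness.
model = IsolationForest(contamination=0.01, random_state=0).fit(transactions)
ml_scores = -model.decision_function(transactions)  # higher = more anomalous

# 3) Combine: anything a rule catches, plus the top ML outliers, goes to review.
review = rule_flags | (ml_scores > np.quantile(ml_scores, 0.99))
print(f"{review.sum()} of {len(transactions)} transactions queued for review")
```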
Anyone interested in financial technology, next-generation machine learning systems and the future of the financial services industry will find this webinar of specific interest.
About the speaker:
Erik McBain, CFA, is a Strategic Account Manager for MindBridge Ai, where he specializes in the deployment of emerging technologies such as artificial intelligence and machine learning systems in global financial institutions and corporations. Over his 10-year career in banking and financial services (Deutsche Bank, CIBCWM, Central Banking), Erik has been immersed in the trading, analysis, and sale of financial instruments and the deployment of new payment, banking and intelligent technologies. Erik's focus is identifying the various opportunities created through technological disruption, creating partnerships, and applying a client-centered innovation process to create transformative experiences, products, and services for his clients.
Richard Peers, Director Financial Services Industry, Microsoft
Artificial Intelligence has a huge role to play in banking, nowhere more so than in sustainable finance. However, the data is patchy, and much of the source data needed to inform sustainable finance is not available. The challenge, as we set off on this new journey, is to make sure that the data and algorithms used are transparent and unbiased.
In this session, Richard Peers, Director of Financial Services Industry at Microsoft, will share how disruption and new entrants are bringing new business models and technology to banking, just as in other industries such as automotive.
One new area is sustainable finance, a voluntary initiative under the COP agreement on climate change, but the data needed to inform the markets is a challenge. Big Data, machine learning and AI can help resolve this.
With such important issues at stake, this session will also outline how AI must be designed according to ethical principles.
Tune in to this session for a high-level view of some key trends and technologies in banking. Get insight into sustainable finance; why AI can help and why Ethical AI is important; and the Microsoft principles for Ethical AI.
Ruturaj Pathak, Senior Product Manager, Networking BU, Inventec
We are seeing a sea change in networking. SDN has enabled improvements in network telemetry and analytics.
In this presentation, I will talk about the current challenges in this space and how this technology shift is helping to improve overall network telemetry. Furthermore, I will share how deep learning techniques are being used in this field. Please join this webinar to understand how the field of network telemetry is changing.
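The abstract doesn't say which deep learning techniques are in use; one common pattern is to train a small autoencoder on telemetry from healthy periods and treat high reconstruction error as a sign of anomaly. The sketch below assumes a hypothetical set of eight normalized per-interface counters and uses PyTorch purely for illustration.

```python
# Hypothetical sketch: an autoencoder trained on telemetry from healthy periods
# (e.g. normalized packet/byte/error/drop counters per interface); unusually
# high reconstruction error suggests an anomaly. One common technique only,
# not necessarily the approach covered in the talk.
import torch
import torch.nn as nn

N_FEATURES = 8  # assumed number of normalized counters per telemetry sample

class TelemetryAutoencoder(nn.Module):
    def __init__(self, n_features=N_FEATURES):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_features, 4), nn.ReLU(), nn.Linear(4, 2))
        self.decoder = nn.Sequential(nn.Linear(2, 4), nn.ReLU(), nn.Linear(4, n_features))

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = TelemetryAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

healthy = torch.rand(512, N_FEATURES)  # stand-in for "known good" telemetry
for _ in range(200):                   # train to reconstruct normal behaviour
    optimizer.zero_grad()
    loss = loss_fn(model(healthy), healthy)
    loss.backward()
    optimizer.step()

sample = torch.rand(1, N_FEATURES)     # a new telemetry sample to check
error = loss_fn(model(sample), sample).item()
print("anomalous" if error > 0.1 else "normal", round(error, 4))  # threshold is illustrative
```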
Dr Louise Beaumont (techUK), Natasha Kyprianides (Hellenic Bank), Tony Fish (AMF Ventures), Katrina Cruz (Anthemis Group)
The tragedy of the commons, popularized by biologist Garrett Hardin in 1968, describes how shared resources are overused and eventually depleted. He compared shared resources to a common grazing pasture; in this scenario, everyone with rights to the pasture acts in self-interest for the greatest short-term personal gain, depleting the resource until it is no longer viable.
The banking ecosystem, and the data that binds it together, is not all that different. For many years, mis-selling scandals, cookie-cutter products and dumb mass-marketing have seen players acting in their own interest, according to what they believe the ecosystem should look like, how it should evolve and who controls it.
But with the introduction of open banking, there are signs that new banking ecosystems are set to thrive. Taking Hardin’s notion, collaboration in the open banking future could benefit everyone in the ecosystem – the traditional banks, the FinTechs, the tech titans with their expertise in delivering services at scale, and yet-to-be-defined participants, likely to include the large data players such as energy firms, retailers and telcos.
Jean-Frederic Clere, Manager, Software Engineering, Red Hat
You can do a lot with a Raspberry Pi and ASF projects, from a tiny object connected to the internet to a small server application. The presentation will explain and demo the following:
- Raspberry Pi as a small server and captive portal using httpd/Tomcat.
- Raspberry Pi as an IoT sensor collecting data and sending it to ActiveMQ.
- Raspberry Pi as a Modbus supervisor controlling an Industruino (Industrial Arduino) and connected to ActiveMQ.
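As a rough sketch of the shape of the second demo (assuming the stomp.py client and a broker with the STOMP connector enabled; the host, credentials, queue name and the fake sensor are invented for illustration):

```python
# Hedged sketch of the "IoT sensor -> ActiveMQ" demo: publish periodic readings
# to an ActiveMQ queue over STOMP using the stomp.py client. The broker host,
# port, credentials, queue name and the fake sensor are all assumptions.
import json
import random
import time

import stomp

conn = stomp.Connection([("activemq.local", 61613)])  # hypothetical broker address
conn.connect("admin", "admin", wait=True)

try:
    while True:
        reading = {
            "sensor": "pi-livingroom",  # hypothetical device id
            "temperature_c": round(random.uniform(18.0, 26.0), 2),
            "timestamp": time.time(),
        }
        conn.send(destination="/queue/sensor.readings", body=json.dumps(reading))
        time.sleep(5)
finally:
    conn.disconnect()
```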
Denis Magda, Director of Product Management, GridGain Systems
The 10x growth in transaction volumes, 50x growth in data volumes and the drive for real-time visibility and responsiveness over the last decade have pushed traditional technologies, including databases, beyond their limits. Your choices are either to buy expensive hardware to accelerate the wrong architecture, or to do what other companies have started to do and invest in the technologies being used for modern hybrid transactional/analytical processing (HTAP) applications.
Learn some of the current best practices in building HTAP applications, and the differences between two of the more common technologies companies use: Apache® Cassandra™ and Apache® Ignite™. This session will cover:
- The requirements for real-time, high volume HTAP applications
- Architectural best practices, including how in-memory computing fits in and has eliminated tradeoffs between consistency, speed and scale
- A detailed comparison of Apache Ignite and GridGain® for HTAP applications
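For a feel of what HTAP-style access to Ignite looks like from application code, here is a minimal sketch using the pyignite thin client; the cluster address, table and schema are assumptions, and this is not the session's benchmark code.

```python
# Minimal sketch of mixed transactional and analytical access to Apache Ignite
# via the pyignite thin client. Host/port, table name and schema are assumptions;
# this is not the session's benchmark code.
from pyignite import Client

client = Client()
client.connect("127.0.0.1", 10800)  # assumed local Ignite node

client.sql(
    "CREATE TABLE IF NOT EXISTS trades "
    "(id INT PRIMARY KEY, symbol VARCHAR, qty INT, price DOUBLE)"
)

# "Transactional" side: fast keyed writes as events arrive.
events = [("ACME", 100, 10.5), ("ACME", 50, 10.7), ("INIT", 200, 4.2)]
for i, (symbol, qty, price) in enumerate(events):
    client.sql(
        "INSERT INTO trades (id, symbol, qty, price) VALUES (?, ?, ?, ?)",
        query_args=[i, symbol, qty, price],
    )

# "Analytical" side: aggregate over the same in-memory data, with no ETL hop.
for symbol, notional in client.sql("SELECT symbol, SUM(qty * price) FROM trades GROUP BY symbol"):
    print(symbol, notional)

client.close()
```

The appeal of the pattern is that keyed writes and SQL aggregations hit the same in-memory data, so there is no separate pipeline between the transactional and analytical sides.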
About the speaker: Denis Magda is the Director of Product Management at GridGain Systems and Vice President of the Apache Ignite PMC. He is an expert in distributed systems and platforms who actively contributes to Apache Ignite and helps companies and individuals deploy it for mission-critical applications. You can be sure to come across Denis at conferences, workshops and other events, sharing his knowledge of use cases, best practices, and implementation tips and tricks for building efficient applications with in-memory data grids, distributed databases and in-memory computing platforms, including Apache Ignite and GridGain.
Before joining GridGain and becoming a part of Apache Ignite community, Denis worked for Oracle where he led the Java ME Embedded Porting Team -- helping bring Java to IoT.
When monitoring an increasing number of machines, the infrastructure and tools need to be rethought. A new tool, ExDeMon, for detecting anomalies and raising actions, has been developed to perform well on this growing infrastructure. Considerations from its development and implementation will be shared.
Daniel has been working at CERN for more than three years as a Big Data developer, implementing different tools for monitoring the organisation's computing infrastructure.
Kirk Borne, Principal Data Scientist, Booz Allen Hamilton
As data analytics becomes more embedded within organizations as an enterprise business practice, the methods and principles of agile processes must also be employed.
Agile includes DataOps, which refers to the tight coupling of data science model-building and model deployment. Agile can also refer to the rapid integration of new data sets into your big data environment for "zero-day" discovery, insights, and actionable intelligence.
The Data Lake is an advantageous approach to implementing an agile data environment, primarily because of its focus on "schema-on-read", thereby skipping the laborious, time-consuming, and fragile process of database modeling, refactoring, and re-indexing every time a new data set is ingested.
Another huge advantage of the data lake approach is the ability to annotate data sets and data granules with intelligent, searchable, reusable, flexible, user-generated, semantic, and contextual metatags. This tag layer makes your data "smart" -- and that makes your agile big data environment smart also!
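To make "schema-on-read" and the tag layer concrete, here is a toy sketch (the records, tags and query are invented for illustration): raw records land untouched, user-generated metadata is attached alongside them, and structure is only interpreted when a question is asked.

```python
# Toy illustration of schema-on-read plus a user-generated tag layer. The
# records, tag vocabulary and query are invented for this example.
import json

# Ingest: drop raw records into the "lake" untouched -- no upfront modeling.
raw_landing_zone = [
    '{"customer": "c-17", "amount": 42.0, "channel": "web"}',
    '{"device": "sensor-9", "temp_c": 71.3, "alert": true}',
]

# Tag layer: searchable, reusable metadata attached alongside each record.
tags = [
    {"domain": "sales", "pii": True, "source": "webstore"},
    {"domain": "iot", "pii": False, "source": "factory-floor"},
]

# Schema-on-read: structure is interpreted only when a question is asked.
def query(domain):
    for raw, meta in zip(raw_landing_zone, tags):
        if meta["domain"] == domain:
            yield json.loads(raw)  # the schema is applied here, at read time

print(list(query("iot")))  # -> [{'device': 'sensor-9', 'temp_c': 71.3, 'alert': True}]
```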
James Serra, Data Platform Solution Architect, Microsoft
With new technologies such as Hive LLAP or Spark SQL, do you still need a data warehouse, or can you just put everything in a data lake and report off of that? In this presentation, James will argue that you still need a relational data warehouse and show how to use a data lake and an RDBMS data warehouse together to get the best of both worlds.
James will go into detail on the characteristics of a data lake and its benefits and why you still need data governance tasks in a data lake. He'll also discuss using Hadoop as the data lake, data virtualization, and the need for OLAP in a big data solution, and he will put it all together by showing common big data architectures.
Robin Marcenac, Sr. Managing Consultant, IBM; Ross Ackerman, Dir. Digital Support Strategy, NetApp; Alex McDonald, SNIA CSI
Watson is a computer system capable of answering questions posed in natural language. Watson was named after IBM's first CEO, Thomas J. Watson. The computer system was specifically developed to answer questions on the quiz show Jeopardy! (where it beat its human competitors) and was then used in commercial applications, the first of which was helping with lung cancer treatment.
NetApp is now using IBM Watson in Elio, a virtual support assistant that responds to queries in natural language. Elio is built using Watson’s cognitive computing capabilities. These enable Elio to analyze unstructured data by using natural language processing to understand grammar and context, understand complex questions, and evaluate all possible meanings to determine what is being asked. Elio then reasons and identifies the best answers to questions with help from experts who monitor the quality of answers and continue to train Elio on more subjects.
Elio and Watson represent an innovative use of large quantities of unstructured data to help solve problems, on average, four times faster than traditional methods. Join us for this webcast, where we’ll discuss:
•The challenges of utilizing large quantities of valuable yet unstructured data
•How Watson and Elio continuously learn as more data arrives and navigate an ever-growing volume of technical information
•How Watson understands customer language and provides understandable responses
Learn how these new and exciting technologies are changing the way we look at and interact with large volumes of traditionally hard-to-analyze data.
After the webcast, check out the Q&A blog: http://www.sniacloud.com/?p=296
Dan Sommer, Senior Director, Market Intelligence Lead at Qlik
It can be hard to keep up with the rapidly changing BI landscape. But it doesn't have to be. Reserve your spot at Qlik's annual BI Trends Webinar.
In this global webinar live replay, we’ll reveal the top BI Trends for the coming year and how they can help you transform your data. Join Qlik’s Global Market Intelligence lead and former Gartner analyst Dan Sommer to learn why 2018 is the year for the “desilofication of data.”
Recent events like the Equifax data leak and new regulations like the EU's General Data Protection Regulation have increased the urgency to further change the BI landscape and move data out of silos.
What is the right strategy and framework?
How can you easily move from "all data," to "combinations of data," to "data insights"?
Can data literacy and augmented intelligence create a data-driven culture?
The volume of data available to decision makers continues to be massive, and is growing faster than our ability to consume it. Learn how to move your data out of silos and turn your data into insights.
RIDE is an all-in-one, multi-user, multi-tenant, secure and scalable platform for developing and sharing Data Science and Analytics, Machine Learning (ML) and Artificial Intelligence (AI) solutions in R, Python and SQL.
RIDE supports development in notebooks, an editor, RMarkdown, Shiny apps, Bokeh and other frameworks. Backed by R-Brain’s optimized kernels, R and Python 3 have full language support, IntelliSense, a debugger and data views. Autocomplete and a content assistant are available for the SQL and Python 2 kernels. Spark (standalone) and TensorFlow images are also provided.
Using Docker to manage workspaces, the platform provides a secure and stable development environment, with powerful admin controls over resources and access levels, including memory usage, CPU usage, and idle time.
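As an illustration of the kind of per-workspace control Docker enables (this is not RIDE's actual admin API; the image, limits and container name are assumptions), the Docker SDK for Python can start a workspace container with hard memory and CPU caps:

```python
# Illustration only: the Docker SDK for Python can cap a workspace container's
# memory and CPU, similar in spirit to the admin controls described above.
# This is not RIDE's actual API; the image, limits and names are assumptions.
import docker

client = docker.from_env()

workspace = client.containers.run(
    "python:3.11-slim",                  # hypothetical workspace image
    command="python -m http.server 8888",
    detach=True,
    mem_limit="2g",                      # cap memory per user workspace
    nano_cpus=1_000_000_000,             # 1.0 CPU
    name="ride-workspace-demo",          # hypothetical container name
)

print(workspace.status)
workspace.stop()
workspace.remove()
```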
The latest stable version of the IDE is always available to all users without any need for upgrades or additional DevOps work. R-Brain also delivers customized development environments for organizations, which can set up their own Docker registry and use their own images.
The RIDE platform is a turnkey solution that increases efficiency in your data science projects by enabling data science teams to work collaboratively without switching between tools. Explore and visualize data and share analyses, all in one IDE with root access and connections to git repositories and databases.
Hélène Lyon, IBM, Distinguished Engineer, IBM Z Solutions Architect
IT is a key player in the digital and cognitive transformation of business processes, delivering solutions for improved business value with analytics. This session will explain, step by step, the journey to secure production while adopting new analytics technologies that leverage mainframe core business assets.
Big Data, Artificial Intelligence and Machine Learning
We will discuss how Big Data, Artificial Intelligence and Machine Learning are rapidly impacting businesses and customers, enabling another massive shift through technology. Todd DeCapua will share how these capabilities are being leveraged in Performance Engineering now and into the future.
Join us for the next Quality & Testing SIG Talk on Tuesday, January 9, 2018: http://www.vivit-worldwide.org/events/EventDetails.aspx?id=1041157&group=.