It’s not every day you get the inside story on how a group of cybersecurity researchers stumbled upon an APT, an advanced persistent threat, while examining the intelligence data from their security kit.
It appeared harmless and boring, yet it had advanced attributes, and Fleming Shi and Jonathan Tanner suspected something was amiss.
Join Amar Singh, practising CISO, on this exclusive webinar as he unpicks what Fleming and Jonathan did next. Their curiosity got the better of them, and they set about tinkering to discover the true intentions of this seemingly benign malware.
This interactive webinar features the Actian Vector analytics database. Mary will provide a product overview and a demo of how to quickly get started on Windows. Viewers can ask questions during the event to be addressed during the live Q&A.
You can download the Evaluation Edition at https://www.actian.com/try-vector
Let’s push past the hype and the BS and look at the facts: Artificial intelligence has evolved to the point where any sales organization that leverages AI will see measurable improvements in customer engagement, LTV, and overall sales.
AI can take on the simplest tasks, like automation of admin work such as activity logging, flagging high-priority emails, and managing contacts in your CRM. It can swiftly and accurately sift through leads, deals, and accounts to identify what’s actually worth your time, and dynamically identify which leads are ready to convert. It’ll help you ditch the sandbagging with accurate quarterly forecasting — and more.
To learn more about how to sell smarter, harder, and more with AI, how to seamlessly integrate the technology into your organization, and to learn of real-world results from leading brands, don’t miss this VB Live event!
Register now for free!
Attend this webinar and learn:
* AI fact versus fiction when it comes to sales
* How to build a data- and AI-friendly sales organization
* The real results leading brands are achieving, and how they do it
* Which AI tools actually bring results and which are still in development
* What's next for AI and sales?
* Rick Winslow, VP, Head of Digital Innovation & Transformation, Capital One Commercial Banking
* Jenny Lin, Data Scientist, Yelp
* Ksenia Kouchnirenko, Head of Business Systems, SurveyMonkey
* Marlene Jia, COO & CoFounder, TopBots
* Rachael Brownell, Moderator, VentureBeat
Applications need data, but the legacy approach of n-tiered application architecture doesn’t solve for today’s challenges. Developers aren’t empowered to build and iterate their code quickly without lengthy review processes from other teams. New data sources cannot be quickly adopted into application development cycles, and developers are not able to control their own requirements when it comes to data platforms.
Part of the challenge here is the existing relationship between two groups: developers and DBAs. Developers are trying to go faster, automating build/test/release cycles with CI/CD, and thrive on the autonomy provided by microservices architectures. DBAs are stewards of data protection, governance, and security. Both of these groups are critically important to running data platforms, but many organizations deal with high friction between these teams. As a result, applications get to market more slowly, and it takes longer for customers to see value.
What if we changed the orientation between developers and DBAs? What if developers consumed data products from data teams? In this session, Pivotal’s Dormain Drewitz and Solstice’s Mike Koleno will speak about:
- Product mindset and how balanced teams can reduce internal friction
- Creating data as a product to align with cloud-native application architectures, like microservices and serverless
- Getting started bringing lean principles into your data organization
- Balancing data usability with data protection, governance, and security
Your network firewall isn’t being honest with you. According to a recent survey conducted by Sophos that gathered responses from 2,700 IT managers at mid-sized organizations, network firewalls cannot identify what is consuming up to 45% of their bandwidth. Additionally, they are failing to protect the organizations they are deployed in, and this inefficiency is costing time and money. Why is this happening? What can be done to correct it? Join Sophos and (ISC)2 on June 14, 2018 at 12:00PM Eastern for a wide-ranging discussion about this survey, the results and what can be done to get the best information and performance from your network firewall.
Most disaster recovery plans are sorely out of date. They were crafted before data started to double every two years, before cloud, virtual machines and containers took over, before digital transformation amplified the importance of data, and before cybercrime became a bigger data threat than natural disasters.
What’s worse, IT shops typically have a range of DR point solutions cobbled together with baling wire, and the limitations of these “systems” are becoming evident as companies scale to meet new digital demands.
In this webinar we’ll examine:
- The challenges represented by environment fragmentation (on-premises, cloud, hybrid) and system complexity
- How new threats such as ransomware change the calculus
- The role cloud can play in recovery
- The promise of orchestration/automation
This panel, from In:Confidence 2018, hosted by The Exponential View's Azeem Azhar, discusses considerations for privacy as the world of AI rapidly grows and develops.
- Azeem Azhar (chair), The Exponential View & Accenture
- Andrea Mestriner, Head of Analytics and Data Visualisation, Just Eat
- Jeni Tennison, CEO, The Open Data Institute
- Sherif Elsayed-Ali, Director of Global Issues and Research, Amnesty International
The panelists explored several key themes, including the rise of AI, what it means for the consumer, considerations for data privacy, and upcoming risks and opportunities for growth.
It's an engaging debate from an industry-leading selection of speakers. Make sure to explore the content.
Watch this important presentation by Jonathan Schabowsky, Senior Architect at Solace, for a fresh perspective on what your IT/OT organization can do to enable seamless and performant data movement across your hybrid cloud.
Gain new insights into:
-The forces driving ever greater distribution of your enterprise data.
-The challenges that growing data distribution poses to your IT agility.
-The enterprise and future-proofing requirements that you’ll want to keep in mind as you navigate data distribution challenges.
Today’s digital organizations are generating more data and content than ever before and are faced with increased scrutiny and increased regulations for how they manage, govern and interact with their content. This is a challenge for all modern organizations running on a complicated array of new and legacy platforms and receiving content in every format imaginable.
In this 30 minute demonstration led by ASG’s Mobius Product and Solutions teams, you’ll learn:
• How built-in encryption at rest and redaction capabilities ensure that private information remains private, whether deployed on-premises, in the cloud or in hybrid cloud environments.
• How event-based retention and hold functionality can broaden your governance reach and reduce unwanted risk and exposure of your enterprise content.
• How capabilities like full text search can empower your business to better leverage dormant content and respond faster to audit requests.
Organizations that use cloud services to handle the Internet of Things often employ one of the three largest hyperscale providers — Microsoft, Google or AWS. In this webinar, we reveal the results of an extensive study, using machine-learning methods, to identify what factors drive the cloud-related costs of Internet of Things deployments, and in what scenarios each cloud provider has a cost advantage.
Data warehousing projects are inherently risky. Traditional waterfall methods usually go over budget and take months, if not years, to implement. Because of their complexities, they create unnecessary dependencies and roadblocks.
In this webinar, learn how to take an iterative data warehousing approach instead. See how a simplified architecture helps your team prove value early, reduces risk in the long run, and creates an agile, high-performance analytics culture.
In this webinar, you will learn:
-The benefits and challenges of adopting a cloud data warehouse
-New tools and approaches to modern analytics and ETL/ELT
-How to quickly and easily transition to agile analytics
-The long-term value of a simple data architecture
In this live webinar, Alex will present the data management challenges and solutions associated with deploying applications at the edge for IoT projects and upstream for mixed OLTP and analytics database workloads. The presentation includes live Q&A.
In this webinar, learn how popular e-commerce site, ASOS, adopted the Azure cloud for real-time analytics. Hear their solutions architect discuss their data integration challenge and how they overcame it with selected technologies.
Certain things go together to make the sum of their parts that much better. Peanut Butter and Jelly. Lennon and McCartney. Batman and Robin. In the ever-changing world of the cloud, cyber security professionals need continuous training and certifications to stay up to speed, and pairing (ISC)2’s CCSP (Certified Cloud Security Professional) with CSA’s CCSK (Certificate of Cloud Security Knowledge) can put any cyber security practitioner ahead in terms of knowledge, skills and job opportunities. On June 12, 2018 at 1:00PM Eastern, join David Shearer, (ISC)2’s CEO, and Jim Reavis, CSA’s CEO, along with other subject matter experts as we explore the differences between each program, the training options available for each, and how these programs are synergistic in nature and were designed to build on one another.
The emergence of next generation distributed energy resources (solar, wind, battery) and storage technologies has challenged the capabilities of traditional, central-station power grid systems -- until now! Advances in embedded systems, analytics, machine learning and time-sensitive networking allow tightened integration, thereby digitizing the grid and addressing technical challenges previously impossible to solve.
This webinar will discuss new approaches to address operational challenges by integrating communications and control technology directly into the grid. Explore the IIC Testbed Program and learn how the award-winning Microgrid and TSN testbeds provide a rigorous testing environment for solving integration challenges in a multi-vendor, real-world environment based on industry standards (DDS, OpenFMB, TSN) and other emerging industrial IoT capabilities.
With insatiable growth of research storage demands, traditional storage solutions have ceased being cost-effective or viable. Konstantinos Mouzakitis, Senior HPC Systems Engineer at Boston Limited (a leading system integrator in the UK and EU) and Caringo Object Storage Solutions Architect Alex Oldfield talk about the benefits of using object storage in HPC use cases in this webinar.
“Poor data quality is enemy number one to the widespread, profitable use of machine learning.” A scary claim. Especially because so many of us know the issue exists, but we don’t know how to address it.
In this webinar, we will first explore why the “garbage-in, garbage-out” problem in analytics and decision making has remained intractable for generations. Then we’ll explore concrete steps you can take to start getting your data quality issues under control.
You’ll also have an opportunity to ask your own questions and get expert answers gleaned from practical experience, applying hard-won lessons to the incredibly steep quality demands of machine learning. By the end of the webinar, you’ll come away with a better insight into the challenges and approaches to creating a comprehensive and well-executed data quality program.
Across many different industries, organizations that produce, sell or distribute products and services, and that are investing in omnichannel customer experience and digital transformation initiatives, are challenged to improve operational efficiencies, reduce costs and speed up time to market. Artificial Intelligence (AI) and Machine Learning (ML) can be a game-changer in helping them achieve their goals.
In this webinar, Christian Farra, Product Specialist MDM at Informatica, will share insights into how companies can leverage AI for product information management (PIM), taking the next step to intelligently fuel their data-driven digital transformation.
Join this webinar to hear how AI, Natural Language Processing (NLP) and ML can simplify your product data enrichment process while reducing costs. This will be showcased based on use-case examples like auto-classification, attribute extraction or image classification.
Ransomware seems to be in the news more and more these days. However, it is rare that you hear about successful recoveries from an attack.
Join our webinar to hear a global healthcare laboratory data expert, Matthew Magbee, share how his company recovered after being hit with a CryptoLocker virus. He will share the strategies that were employed to identify the attack, react, and recover their data quickly and effectively.
Commvault Senior Solutions Manager Gregg Ogden and Senior Director of User Data Group Ashish Morzaria will join the conversation to discuss how Commvault provides the threat and risk mitigation solutions you need to help ensure business continuity.
Join this webinar to learn:
- How this global healthcare company recovered after an attack
- Threat and risk mitigation solutions from Commvault
- The importance of maintaining independent data protection
Transform your business with data center modernization. Learn how Hitachi Vantara’s robust set of systems, software and services are designed to help customers achieve desired business outcomes in the data center.
In this webinar, you’ll:
- Learn how to reduce cost, maximize efficiencies and effectiveness across all data center operations.
- See real-life examples of how Hitachi provides solutions to increase agility and improve customer experiences.
- Discuss key elements needed to achieve a modern data center.
The Internet of Things (IoT) presents an unparalleled opportunity for organizations to break new ground with innovative products and solutions. It will also give consumers the ability to efficiently access goods and services in real-time at any desired location. However, businesses must continue to ensure maximum protection for their assets and the data they hold.
This webinar presents the case for:
· Why security efforts need to be focused and comprehensive when implementing an IoT strategy
· The need to have a robust structure from which to base your security and privacy controls
· Why a 5-point framework is a highly effective approach
The discussion will also leverage industry examples to help you understand how integrating good cyber security controls into your innovative IoT solutions will help meet your critical business objectives.
Security teams are increasingly using User and Entity Behaviour Analytics (UEBA) to detect, prioritise, and respond to anomalous and alarming user behaviour.
Hear from LogRhythm customer Stephen Frank, director of technology & security at the National Hockey League Players Association (NHLPA), as he shares how his team has applied UEBA to meet their security needs. Joined by Damon Gross of LogRhythm, he will share use cases and explain how LogRhythm supports their security initiatives.
Join us to discover:
• Why UEBA is a critical component to effective security
• A customer's security environment challenges and key use cases
• Innovations and advancements in UEBA
• A short showcase of UEBA capabilities from inside the LogRhythm platform
Register now to get an inside look at how NHLPA is enhancing their UEBA capabilities.
Join Esther Spanjer, Director of Business Development EMEIA at Western Digital, and Janusz Bak, CTO of Open-E, for this webinar. They will discuss the challenges that Aviation Accounting Center LLC, an engineering company in geospatial data processing, was facing when planning the expansion of its IT infrastructure. Its existing standalone servers did not provide the capacity, availability and performance needed for storing and accessing its geospatial data. Esther and Janusz will walk you through the proposed solution and how it met the customer’s scalability, capacity, throughput, connectivity and high-availability requirements.
APIs and microservices are awesome if you're a technical developer, but what if you're not and you still need to understand how they connect?
In this webcast, Leon Stigter and Bruno Trimouille of TIBCO Software introduce how low-code platforms can help marketing and sales teams to automate their workflows and deliver on business goals without getting under the hood.
In this webcast we cover:
-A primer on APIs, microservices, and low-code application development
-A deep dive into digital business platforms and their required capabilities
-How different types of users can leverage a digital business platform for shared benefits
Today’s data centers are expected to deploy, manage, and report on different tiers of business applications, databases, virtual workloads, home directories, and file sharing simultaneously.
HPE 3PAR StoreServ is highly efficient, flash-optimized storage engineered for the true convergence of block, file, and object access to help consolidate diverse workloads efficiently.
This session provides an overview of HPE 3PAR File Persona and core file data services.
Data is everywhere! Growing exponentially in size, and complexity too. In this session, we look specifically at the ‘Holy Grail’ of Single Customer View, and at some of the techniques required to achieve it in your own data. But why should it be exclusive to customer data? Isn’t product data just as important to a product manager as customer data is to a marketer or salesperson? Davinity explores the DQ techniques which can be applied to any type or kind of data to keep it clean and fit for purpose. And is there any difference in achieving such high data quality in a Big Data space over a more traditional environment? Tune in to find out…
There is a growing demand for high-speed analytic database systems and platforms where data can be processed at lightning speed for business decision-making. However, is that data up-to-date? If the process to update that data is batch or overnight, the data delivery may be fast, but the information may be outdated which can result in error-prone reporting and ineffective (or disastrous) business decisions.
For successful analytics it’s important to move data in real time from multiple systems, including legacy databases, so that combined data can be properly presented in a uniform way.
Change Data Capture is a technology employed in the best data replication solutions to minimize the amount of time and resources necessary to update data changes from one system to another. By propagating only what’s changed since the last check on the source system, minimal data is passed between systems in order to ensure identical copies of data on both source and target.
Change Data Capture is frequently used to select and update copies of key data for operational reasons such as maintaining activity on separate systems that interact externally with customers or vendors, or remote systems that interact with corporate databases. However, it is also useful to maintain freshly updated data in systems that are used extensively for non-transactional activity, such as for analysis, reporting and data warehousing.
1. Retrieving information from any source database for your analytic system
2. Specialized and configured support to load data into the analytic system
3. Automatically prepare and map the data between systems, saving time and resources
4. Best-in-class process to automatically and continuously update data
5. Change Data Capture saves time, money and effort through efficient and cost-effective technology
Legacy development approaches and tools simply aren’t architected to handle the enormous flow of real-time, event-driven, data streams generated by IoT devices, existing systems, and people.
In this session, learn how the revolution in event-driven application platforms enables innovative companies to develop, deploy and run real-time enterprise applications with dramatically reduced time-to-market, significantly lowered development and maintenance costs, and maximized agility in the face of requirements for continuous innovation and digital transformation.
Having the right processes run on the right platforms makes sense, but meeting this challenge can be harder than it seems. In many cases, these processes have been running on their current platforms for years and the team that set them up is no longer with the company. This type of situation leaves the IT department with big software maintenance costs and outdated infrastructure.
Join ASG and Destiny on June 19th at 1:00pm EST for a live webinar as we share information and guidance to help IT managers modernize their legacy environment, including how to:
•Understand upstream/downstream application interactions
•Refactor to a modern toolset and architecture
•Surface “unseen” personal data that may impact compliance risk
•Leverage change analysis and application understanding into a foundation for regulatory compliance and ongoing application management
DeepStorage Labs is known in the storage industry for pushing equipment to its limits, and for reporting what really happens at the edge of a system’s performance. Tegile’s IntelliFlash T4000, unlike a few previous occupants of the DeepStorage Labs ThunderDome, stood up to our testing and delivered high IOPS at a maximum of 1 ms latency.
DeepStorage subjected the IntelliFlash T4000 to workloads ranging from the usual 4KB “hero number” random read to workloads that simulate OLTP and OLAP database servers, a file server and an Exchange server. We determined the system’s performance on each workload individually and in combination, finally determining its ability to support that kind of mixed-workload environment.
In this webinar we will:
- Introduce the IntelliFlash array
- Describe the testing process
- Present the results
- Review the test environment
- Provide links to the test workload VDbench configurations
To keep pace with today’s media and digital asset management workflows, you need a cost-effective secondary tier of storage (active archive) that provides instant accessibility and unrelenting data protection, while scaling to store petabytes of unstructured data and billions of files. Caringo Senior Consultant John Bell and Engineer Jose Juan Gonzalez will explain how object storage (using NoSQL, unstructured methods of search like Elasticsearch, and advanced metadata and content management capabilities) can be used to build this active archive, and will illustrate its use with a live demo of how Caringo Swarm integrates with leading industry tools such as CatDV media asset management (MAM).
Explore the role of the IoT in the mining industry. From big industry to incubators and startups -- many organizations are engaging globally via innovative ecosystems, testbeds and tech hubs for education, awareness, and best practices.
In this webinar we will look at Dell EMC’s new storage system, PowerMax, announced at Dell Technologies World. The focus will be on PowerMax’s architectural features and the benefits they provide, the key differences from previous storage models, as well as current trends in data storage. This webinar is held with the support of Intel®.
As organisations are seizing the opportunity to become more agile and enable transformation by intelligently using data, the need to act to defend against being disrupted by market, business, and technology forces has never been greater. With data increasingly becoming a business asset, organisations are looking to more intelligently deliver Data Governance as a way to maximise the value of their data.
During this webinar Informatica will explore three fundamental requirements for Intelligent Data Governance:
•Collaboration around data, across an organisation
•Integration of organisational knowledge around data
•Automation of the discovery of data, across an organization
By combining these three requirements, plus more, Informatica will show how Intelligent Data Governance is rapidly changing the way organisations govern their most precious asset: their data.
To improve product availability and product quality within your physical and online retail stores, you need to understand what’s happening at each step in a product's journey up to the point of sale - this is your flow-of-goods.
In this webinar, retail analytics expert Simon Runc will demonstrate how to visually analyse your data at every point along the flow-of-goods, including your supply chain data and point of sale data, so you can make quicker and better product decisions in real-time.
Learn how to visually analyse your retail flow-of-goods data so you can...
*See in real-time what’s happening to your products
*Combine your supply chain data and point of sale data for greater insight
*Reduce stock-holding but improve on-shelf availability
*Take action before low stock availability becomes an issue
*Empower suppliers to act ahead of difficulties
Becoming more competitive with big data today means having the right technology to uncover new insights from your data and make critical business decisions in real time. Qubole and Microsoft help companies activate their big data in the cloud to uncover insights that improve customer engagement, increase revenue, and lower costs.
Join experts from Qubole and Microsoft as they discuss how to activate your big data and how to get the most out of open source technologies on the cloud. In this webinar, you'll learn:
- How to modernize with data lakes and data warehouses on the cloud
- Strategies for boosting business value out of Machine Learning and advanced analytics with Qubole on Azure
- How to reduce costs, control risks, and improve data governance as you build your data pipelines
- The importance of data security and privacy
- Real world examples of successful companies activating their big data
Nate Shea-han
Americas Global Black Belt, Data & AI at Microsoft
Nate Shea-han has been with Microsoft for 14 years and has spent the last 8 years focused on helping Microsoft customers transform their business in the cloud on the Azure platform. Currently he has responsibilities across the United States, Canada and Latin America for Microsoft’s AI, big data, and analytics offerings. Nate has also worked extensively with the Microsoft partner community.
Shaun Van Staden
Solutions Architect, Qubole
Shaun Van Staden has 19 years of experience in enterprise software managing advanced analytics projects, as a developer, DBA, business analyst and now a solutions architect. As a solutions architect manager, Shaun is responsible for supporting business development and sales at Qubole and helping customers transform their use cases for the cloud. Prior to Qubole, Shaun worked as a solutions architect at NICE Systems and Merced Systems (acquired by NICE).
Believe it or not, there was a time when collecting potentially responsive ESI from email systems for discovery was considered overly burdensome. Now, it’s commonplace and much of it can be automated. But that’s not where all of the responsive ESI resides today – much of it is on your mobile device, in social media platforms and even in Internet of Things (IoT) devices. Are you ignoring this potentially important data? Do you have to hire a forensics professional to collect this data or can you do much of it on your own? This CLE-approved* webcast will discuss what lawyers need to know about the various sources of ESI today, examples of how those sources of data can be responsive to litigations and investigations, and how lawyers may be able to collect much of this data today using intuitive applications and simple approaches. Topics include:
+ Challenges from Various Sources of ESI Data
+ Ethical Duties and Rules for Understanding Technology
+ Key Case Law Related to Mobile Devices, Social Media and IoT
+ Options and Examples for Collecting from Mobile Devices
+ Options and Examples for Collecting from Social Media
+ Examples of IoT Devices and Collection Strategies
+ Recommendations for Addressing Collection Requirements
+ Resources for More Information
* MCLE Approved in Selected States
Presentation Leader: Doug Austin
Doug is the VP of Products and Professional Services for CloudNine. At CloudNine, Doug manages professional services consulting projects for CloudNine clients. Doug has over 25 years of experience providing legal technology consulting, technical project management and software development services to numerous commercial and government clients.
Special Consultant to CloudNine: Tom O'Connor
Tom O’Connor is a nationally known consultant, speaker, and writer in the field of computerized litigation support systems. Tom’s consulting experience is primarily in complex litigation matters.
Efforts to derive value from Edge Computing and IoT continue to grow. In addition to the multitude of PoCs taking place in the Cloud, scaled-up implementations are increasingly looking at local persistent data, and the underlying device and gateway technologies are evolving to accommodate this new design requirement. Central to this shift is the growth of developer communities around Android, iOS, embedded Linux and Windows, Intel, ARM and others, driving a surge of new applications with locally managed and analyzed data on IoT/OT edge devices and gateways.
Considerations for metadata management, data movement, security, and governance of the data associated with these apps, along with processing and packaging overhead, will require on-device and gateway persistent storage. Flat file systems and other prior tools don’t meet modern embedded data management platform or analytics requirements.
Join us to hear how to deliver persistent data management at the edge.
- Avoid the hassle and security complications that ETL between things, gateways and datacenters will create.
- Experience a real-world demo in Python with source code samples showing how you can code once and deploy to multiple target platforms.
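The session's own demo is not reproduced here, but a minimal sketch of edge-local persistence in Python might look like the following. The `EdgeStore` class and its schema are hypothetical, chosen only to illustrate the "code once, deploy to multiple targets" idea: `sqlite3` ships with CPython, so the same module runs unchanged on Linux, Windows and embedded targets.

```python
import json
import sqlite3

class EdgeStore:
    """Hypothetical edge-local store; not the session's actual demo code."""

    def __init__(self, path=":memory:"):
        # On a real device, path would point at local flash storage
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS readings (sensor TEXT, payload TEXT)"
        )

    def record(self, sensor, payload):
        # Persist locally; no round trip to a gateway or datacenter required
        self.conn.execute(
            "INSERT INTO readings VALUES (?, ?)", (sensor, json.dumps(payload))
        )
        self.conn.commit()

    def latest(self, sensor):
        # Most recent reading for a sensor, or None if nothing recorded yet
        row = self.conn.execute(
            "SELECT payload FROM readings WHERE sensor=? ORDER BY rowid DESC LIMIT 1",
            (sensor,),
        ).fetchone()
        return json.loads(row[0]) if row else None
```

Keeping readings on the device this way sidesteps the ETL-between-things problem the session describes: data is analyzed where it lands, and only results need to move.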
Artificial Intelligence is a part of our daily lives through the use of technologies like virtual assistants such as Cortana, smart homes, and automated customer service. We also have the power of the Internet of Things technology in organizations. How can we put them together for success in our organizations?
Businesses are running the Red Queen's race not just to win, but to survive in a world where Artificial Intelligence and IoT are becoming the present as well as the future of technology, and ideas are developing into reality at accelerated rates.
How can you help your company to evolve, adapt and succeed using IoT and Artificial Intelligence to stay at the forefront of the competition, and win the Red Queen's Race? What are the potential issues, complications, and benefits that the future of technology could bring to us and our organisations, right now?
In this session, Jen Stirrup will explain quick wins in AI and IoT to help you and your organization win the Red Queen's race.
For companies to realize the full potential of IoT enablement, they need to combine IoT with rapidly-advancing Artificial Intelligence technologies, which enable ‘smart machines’ to simulate intelligent behavior and make well-informed decisions with little or no human intervention.
Join this webcast to learn and discuss the best practices and trends coming for AI in IoT.
About the speaker:
Ahmed Banafa has extensive experience in research, operations and management, with a focus on IoT, Blockchain and AI. He is a reviewer and a technical contributor for several technical books. He has served as faculty at well-known universities and colleges, including the University of California, Berkeley; California State University-East Bay; San Jose State University; and the University of Massachusetts. He is the recipient of several awards, including the Distinguished Tenured Staff Award of 2013, Instructor of the Year for 2013 and 2014, and a Certificate of Honor from the City and County of San Francisco. He was named the number-one tech voice to follow by LinkedIn in 2016; his research has been featured in many reputable sites and magazines, including Forbes, IEEE and MIT Technology Review, and he has been interviewed by ABC, CBS, NBC and Fox TV and radio stations.
This webinar will focus on the issues surrounding the “Total Installed Cost” (TIC) of Backbone/Riser fiber cabling. We will explore the contributing factors to variability of TIC, focusing on field termination skill and the impact of permanent link testing on the individual connector PASS/FAIL decision making process (via a mathematical model).
We will also review the development of new tools and novel methods to mitigate errors in estimating Insertion Loss (IL) for field-terminated connectors, including new “expert system” technology deployed to test connector IL during the termination sequence.
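The webinar’s mathematical model is not published here, but the basic worst-case link budget that feeds a PASS/FAIL decision can be sketched as fiber attenuation plus fixed per-event losses. The per-connector (0.75 dB per mated pair) and per-splice (0.3 dB) defaults below are commonly cited TIA-568 maximums; actual limits depend on the standard, wavelength, and application being certified, so treat this as an illustrative assumption, not a test procedure.

```python
def insertion_loss_budget(length_km, fiber_db_per_km, n_connectors, n_splices,
                          connector_db=0.75, splice_db=0.3):
    """Worst-case channel insertion loss (dB) for a TIA-568-style link budget.

    Assumed defaults: 0.75 dB per mated connector pair, 0.3 dB per splice.
    A measured permanent-link loss above this budget would FAIL.
    """
    return (length_km * fiber_db_per_km      # distributed fiber attenuation
            + n_connectors * connector_db    # connector events
            + n_splices * splice_db)         # splice events

# Example: 300 m of multimode fiber at 850 nm (3.5 dB/km),
# two mated connector pairs, no splices.
budget = insertion_loss_budget(0.3, 3.5, 2, 0)
print(round(budget, 2))  # → 2.55
```

A field tester comparing a measured IL of, say, 2.9 dB against this 2.55 dB budget would flag the link, which is why connector termination skill drives so much of the TIC variability discussed above.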
Based on IIA's new and updated research on organizing analytics teams, Research Advisor Bob Morison will review the objectives and variables of organizational structure and share examples of how enterprises are adjusting their structures to deploy and develop analysts effectively, incorporate new methods and technologies, address strategic business opportunities, and leverage the analytics ecosystem.
The nature of enterprise data is rapidly changing, and existing storage infrastructures can’t keep up. Network Attached Storage (NAS) devices were designed for performance and single-site collaboration, but file creation and access are different now. Many turn to the cloud to offload data; however, for rapidly scaling data sets, daily transfer rates and bandwidth constraints are an issue. In addition, some sensitive information can’t leave your data center. Komprise and Caringo have partnered to solve these issues by pairing intelligent data management technology with hassle-free, limitless storage.
Attend this webinar to learn how you can slash TCO for rapidly scaling data sets by identifying data to move off NAS, then securely transferring it, based on its value, to Caringo Swarm scale-out object storage, where it is protected without backups and is instantly and securely available internally or externally.
As enterprises transition their Business Intelligence and Analytics environments to Machine Learning and Artificial Intelligence driven ecosystems, their core data infrastructure has to scale. Focusing only on the compute layers creates a highly inefficient infrastructure. Vexata, with its VX-OS version 3.5 release, brings to market transformative economics and breakthrough performance to power these next-generation workloads at scale.
You will learn about:
• How to scale core data infrastructures for the transition to Machine Learning and Artificial Intelligence workloads
• What are the key considerations before creating an AI/ML-centric storage infrastructure
• How Vexata's new VX-OS version 3.5 release addresses these challenges
We get recommendations every day: Facebook recommends people we should connect with; Amazon recommends products we should buy; and Google Maps recommends routes to take. What all these recommendation systems have in common are data science and modern software development.
Recommendation systems are also valuable for companies in industries as diverse as retail, telecommunications, and energy. In a recent engagement, for example, Pivotal data scientists and developers worked with a large energy company to build a machine learning-based product recommendation system to deliver intelligent and targeted product recommendations to customers to increase revenue.
In this webinar, Pivotal data scientist Ambarish Joshi will take you step-by-step through the engagement, explaining how he and his Pivotal colleagues worked with the customer to collect and analyze data, develop predictive models, and operationalize the resulting insights and surface them via APIs to customer-facing applications. In addition, you will learn how to:
- Apply agile practices to data science and analytics.
- Use test-driven development for feature engineering, model scoring, and validation scripts.
- Automate data science pipelines using PySpark scripts to generate recommendations.
- Apply a microservices-based architecture to integrate product recommendations into mobile applications and call center systems.
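Pivotal’s actual PySpark pipeline is not reproduced here, but the core idea of a machine learning-based product recommender can be sketched with a simple item-item collaborative filter: score each product a customer hasn’t bought by the cosine similarity between its rating column and the columns of products they have rated. The `ratings` data and `recommend` helper are purely illustrative assumptions.

```python
from math import sqrt

def cosine(u, v):
    # Cosine similarity between two rating vectors (0 = unrated).
    dot = sum(a * b for a, b in zip(u, v))
    nu, nv = sqrt(sum(a * a for a in u)), sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def recommend(ratings, user, top_n=1):
    """Rank items `user` has not rated by similarity-weighted scores.

    `ratings` maps user -> {item: rating}. Each candidate item is scored
    by how similar it is to the items the user already rated.
    """
    items = sorted({i for r in ratings.values() for i in r})
    # One column per item: that item's ratings across all users.
    col = {i: [ratings[u].get(i, 0) for u in sorted(ratings)] for i in items}
    rated = ratings[user]
    scores = {
        i: sum(cosine(col[i], col[j]) * r for j, r in rated.items())
        for i in items if i not in rated
    }
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

ratings = {
    "alice": {"kettle": 5, "toaster": 4},
    "bob":   {"kettle": 4, "toaster": 5, "blender": 5},
    "carol": {"kettle": 1, "blender": 4},
}
print(recommend(ratings, "alice"))  # → ['blender']
```

A production system like the one described above would compute these similarities at scale (e.g. in PySpark), retrain on a schedule, and expose the top-N results behind an API for the mobile and call-center applications to consume.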