    • Cost Effectively Run Multiple Oracle Database Copies at Scale. Jeff Wright, Database Systems Architect, SolidFire. Recorded: Feb 26 2015 4:00 pm UTC (41 mins)
    • Scaling multiple databases with a single legacy storage system works well from a cost perspective, but workload conflicts and hardware contention make these solutions an unattractive choice for anything but low-performance applications.

      Attend the webinar to learn about:
      - How SolidFire’s all-flash storage system provides high performance at massive scale for mixed workload processing while simultaneously controlling costs and guaranteeing performance
      - How to deploy four or more database copies using SolidFire’s Oracle Validated Configuration, at a price point at or below the cost of traditional storage systems
      - SolidFire’s Quality of Service (QoS) guarantee; every copy receives dedicated all-flash performance, so IT admins can deliver solutions with confidence and maximize business efficiency

    • Top 5 Reasons to Deploy a Dedicated Database Security Solution. Sean Roth, Manager, Database Security, McAfee. Recorded: Aug 1 2012 6:00 pm UTC (40 mins)
    • Protecting the valuable and confidential information stored within databases is vital for maintaining the integrity and reputation of organizations everywhere—not to mention ensuring regulatory compliance. However, many organizations still rely on security solutions with inherent limitations. Given the complexities of today’s database platforms and the sophistication of today’s cybercriminals, deploying a comprehensive and dedicated database security solution is a must. Here are five reasons why.

      Join this in-depth discussion on enterprise database security and learn how to (1) overcome the inherent limitations of perimeter security and DBMS security features, (2) avoid the major cost and operational challenges of taking your 'reactive' database security to an optimized practice, and (3) establish real-time protection and continuous compliance with ZERO downtime.

    • Embedding database analytics as stored external procedures. Wendy Hou, Product Manager, and Mark Sweeney, Sales Engineer, Rogue Wave Software. Recorded: Dec 10 2015 6:00 pm UTC (53 mins)
    • Inside or outside, which is better? You know that embedding analytics in databases offers several benefits, including security, performance, and enabling users to take advantage of the analytics more readily. But how do you do it?

      In this installment of our embedded analytics series, we discuss embedding analytics using stored external procedures – an option provided by all major commercial RDBMS providers. These procedures are invoked in the same manner as internal stored SQL procedures, but they run in a process space separate from that of the database itself. This separation can be advantageous in certain scenarios: in particular, if a data set selected for analysis pushes the limits of physical memory, the database is isolated from any issues that arise in running the analytics on that problematic data.
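
      As a rough sketch of the invocation side (assuming an Oracle external procedure named run_analysis has already been registered against a C shared library via CREATE LIBRARY and CREATE PROCEDURE ... AS LANGUAGE C; all names and parameters here are hypothetical), the call looks exactly like a call to an internal stored procedure:

      # Hypothetical sketch: invoking a stored external procedure from Python.
      # The analytic code runs in a separate process space (Oracle's extproc
      # agent), but the call site cannot tell the difference.
      import oracledb  # the python-oracledb driver

      conn = oracledb.connect(user="analyst", password="...", dsn="dbhost/orclpdb")
      with conn.cursor() as cur:
          # Invoked like any internal stored procedure.
          cur.callproc("run_analysis", ["SALES_2015", 0.95])
      conn.close()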

      In this webinar, you will see detailed steps for implementing the analytics as a shared library, using the IMSL Libraries for C as the illustration.

      If you missed part one of the series, watch the recording here: https://www.brighttalk.com/webcast/12285/164525

    • Three Proven Ways to Improve Database Performance with Intelligent Flash Arrays. Narayan Venkat. Recorded: Apr 12 2016 5:00 pm UTC (48 mins)
    • When it comes to your database environments, do you have some applications that go "whole hog" on data and consume all your resources -- while other applications are starving? You are not alone. IT Managers choose Tegile flash storage solutions so they can easily drive multiple applications and multiple workloads -- and make sure no workload goes hungry.

      We will examine how to accelerate transactional workloads running on an Oracle DB, while reducing IO wait times.

      We will look at how organizations implement Quality of Service (QoS) standards to ensure that one application does not end up consuming all the available resources.

      We will share how to architect a storage infrastructure that delivers effective data protection without impacting business performance.

      Join us for this live webinar hosted by database experts.

    • MySQL: Scale Through Consolidation. Chris Merz, Database Systems Architect, SolidFire. Recorded: Jun 16 2015 5:00 pm UTC (26 mins)
    • Although it may sound like an oxymoron, the key to scaling a MySQL platform truly lies in consolidation of the physical storage layer. Whether you are running a dozen or a thousand MySQL instances, SolidFire provides a pathway to horizontally scale the storage layer, enabling capital and operational cost reductions, while virtually eliminating maintenance and replica deployment windows.

      Attend the webinar to learn:

      - How SolidFire can guarantee storage performance, dynamically adjust storage resources on the fly, and linearly and non-disruptively scale your MySQL database storage infrastructure.
      - How you can reduce deployment times for MySQL replication slaves and reporting copies from hours to seconds.

      Join us in the discussion on the benefits of consolidating MySQL workloads on the storage industry’s only all-flash, scale-out, QoS-enabled storage system. With SolidFire you can provision, manage and clone production, reporting, dev/test and QA environments safely, all on the same array.

    • Database Surveillance and Protection: 3 Ways to Stop Hackers. Cheryl O’Neill, Dir. Product Marketing, Imperva, and David O’Leary, Dir. Security Solutions, Forsythe. Recorded: Dec 2 2015 5:00 pm UTC (47 mins)
    • Data thieves are opportunistic, looking for unprotected databases in the forgotten digital corners of your company. They are content to steal any data that lies within easy reach.

      Large companies are especially vulnerable. With hundreds or even thousands of databases spread throughout business units and across multiple geographies, it is only a matter of time until your unprotected data is accessed and stolen.

      Fortunately, it doesn’t have to be complicated, tedious or expensive to protect all of your sensitive data with a database monitoring solution. The right database monitoring solution can also provide visibility into data usage and simplify compliance audits.

      Join us for this webinar to learn:
      • Benefits of database monitoring over native audit tools
      • Factors to consider before investing in database audit and protection
      • 3 specific ways to leverage database monitoring for improved security

    • Building High Performance Infrastructure for Databases. Jon Toigo, Chairman, Data Management Institute; Sushant Rao, Sr. Director of Product Marketing, DataCore Software. Recorded: Mar 29 2016 6:00 pm UTC (44 mins)
    • Learn how to reduce latency and improve performance in your database environment without expensive hardware rip and replace.

      Regardless of your industry, chances are that databases form the core of your profitability. Whether online transaction processing systems, Big Data analytics systems, or reporting systems, databases manage your most important information – the kind of data that directly supports decisions and provides immediate feedback on business actions and results. The performance of databases has a direct bearing on the profitability of your organization. These days, with 70 percent of respondents to one recent survey stating that IT must justify its budget by demonstrating real contributions to the bottom line, smart IT planners are always looking for ways to improve the performance of databases and the apps that use them.

      Many in the industry are pitching expensive flash storage peripherals to reduce latency and drive performance in database operations, but what is really needed is improvement across the I/O path – cost-effective improvements to infrastructure that will yield measurable gains not only in database processing, but also in the extract-transform-load workflows that define overall performance efficiency.

      Join us as industry analyst Jon Toigo provides an overview of a strategy you can use to reduce latency and improve database performance without breaking the bank.

    • Not Your Father's Database. Vida Ha. Recorded: Apr 7 2016 5:00 pm UTC (51 mins)
    • This session will cover a series of use cases where you can store your data cheaply in files and analyze the data with Apache Spark, as well as use cases where you want to store your data into a different data source to access with Spark DataFrames. Here’s an example outline of some of the topics that will be covered in the talk:

      Use cases to store in file systems to use with Apache Spark:

      1. Analyzing a large set of data files (see the sketch after this list).
      2. Doing ETL of a large amount of data.
      3. Applying Machine Learning & Data Science to a large dataset.
      4. Connecting BI/Visualization tools to Apache Spark to analyze large datasets internally.
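
      For instance, a minimal PySpark sketch of use case 1, using the current SparkSession API (the paths and field names are hypothetical):

      # Hypothetical sketch: analyzing a large set of JSON files directly
      # with Spark, no database required.
      from pyspark.sql import SparkSession

      spark = SparkSession.builder.appName("file-analysis").getOrCreate()

      # Spark infers the schema and parallelizes the read across the cluster.
      events = spark.read.json("s3://example-bucket/events/2016/*.json")
      events.createOrReplaceTempView("events")

      # Standard SQL over cheap file storage.
      daily = spark.sql(
          "SELECT event_date, COUNT(*) AS n FROM events GROUP BY event_date")
      daily.write.mode("overwrite").parquet("s3://example-bucket/daily_counts/")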

      Use cases to store your data into databases for use with Apache Spark:

      1. Random access, frequent inserts, and updates of rows of SQL tables. Databases have better performance for these use cases.
      2. Supporting incremental updates of databases into Spark. It’s not performant to update Spark SQL tables backed by files. Instead, you can use message queues with Spark Streaming, or do an incremental select, to make sure your Spark SQL tables stay up to date with your production databases (see the sketch after this list).
      3. External reporting with many concurrent requests. While Spark’s ability to cache your file data in memory will allow you to get back to fast interactive querying, that may not be optimal for supporting many concurrent requests. If you have many concurrent users to support, it’s better to use Spark to ETL your data into summary tables, or some other format, in a traditional database and serve your reports from there.
      4. Searching content. A Spark job can certainly be written to filter or search for any content in files that you’d like, but ElasticSearch is a specialized engine designed to return search results more quickly.
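
      As a minimal sketch of the incremental-select pattern from use case 2 (the JDBC URL, credentials, and table and column names are all hypothetical, and the MySQL JDBC driver must be on Spark's classpath):

      # Hypothetical sketch: pull only rows changed since the last run from a
      # production database into a Spark SQL table.
      from pyspark.sql import SparkSession

      spark = SparkSession.builder.appName("incremental-sync").getOrCreate()

      last_watermark = "2016-04-01 00:00:00"  # persisted from the previous run

      # Push the watermark filter down to the source database.
      incr_query = ("(SELECT * FROM orders WHERE updated_at > '{0}') AS incr"
                    .format(last_watermark))

      new_rows = (spark.read.format("jdbc")
                  .option("url", "jdbc:mysql://prod-db:3306/shop")
                  .option("dbtable", incr_query)
                  .option("user", "reporter")
                  .option("password", "...")
                  .load())

      # Append only the delta so the Spark SQL table tracks production.
      new_rows.write.mode("append").saveAsTable("orders_mirror")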
