Operators face dynamic challenges in storing and accessing the ever-expanding quantities of data available to the business. In this workshop you will learn strategies to invest more wisely in your storage solutions and ensure you are combining computing power with big data to deliver value to end users.
Join this workshop to:
Donald (Rick) McMullen, Associate Director, HPRC, Texas A&M University
Rick McMullen currently serves as Associate Director of the High Performance Research Computing Center at Texas A&M University (TAMU) where he plans and develops research computing technologies and services and works with researchers to develop solutions for computing and data intensive problems.
Prior to joining the Center at TAMU, Rick was Senior Director for Researcher Engagement and Development at Internet2. There he worked with researchers and IT organizations across the U.S. to design and implement infrastructure to support large-scale scientific collaborations spanning many institutions. He has also served as faculty, research lab director and center director for HPC units at several universities, and was a founding faculty member of the Indiana University School of Informatics. His research interests include high performance computing architectures; high performance research networking; cloud computing and storage architectures for research; provenance and life-cycle management of information assets in scientific collaborations; long-term data management and stewardship; performance of global-scale WAN applications for eScience; systems and services for collaboration in e-Science; and artificial intelligence applications that support knowledge management and decision-making in scientific research collaborations.
Rick’s background is in Chemistry. He received a Ph.D. in 1982 from Indiana University.
Organizations have lost millions due to poor data management practices, yet remain unaware of the root causes of their losses. Unless IT professionals can monetize these lost opportunities and their related costs, gaining executive-level approval for basic data management investments will remain difficult. This workshop illustrates how to identify the specific costs of poor data management practices using a broad range of real cross-sector examples. As oil and gas operators begin to see poor data management practices as the root cause of many of their problems, they will be more than willing to make the required investments in our profession.
Tangible take-aways will include:
• Understand how to assign monetary costs to poor data management; the practice is new, but it is not rocket science, and with experience organizations can become quite good at it
• Learn from real-world cross-sector examples of overcoming poor data management and how these readily transfer to other industries
• See how the process of adding up the hidden costs of poor data management is exactly the same as accumulating the hidden benefits of good data management
• Hear how many parts of the data logistics value chain can be improved to drive business value
• Explore the non-monetary costs as well as the strictly monetary ones to build a better business case
• Discuss the important legal implications you need to be aware of
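The costing exercise described above is, at its core, simple arithmetic: assign a dollar figure to each recurring incident of poor data management and scale it by frequency. A minimal sketch in Python, using invented example figures (the cost items and amounts below are hypothetical, not from the workshop):

```python
# Hypothetical illustration of tallying the annual cost of poor data
# management. All line items and dollar figures are invented examples.

def annual_cost_of_poor_data(items):
    """Sum per-incident costs scaled by yearly frequency."""
    return sum(cost_per_incident * incidents_per_year
               for cost_per_incident, incidents_per_year in items)

# (cost per incident in USD, incidents per year) -- invented figures
hidden_costs = [
    (450.0, 1200),   # rework hours re-validating duplicate well records
    (15000.0, 6),    # decisions delayed while data is cleaned up
    (2500.0, 40),    # reports rebuilt from inconsistent sources
]

total = annual_cost_of_poor_data(hidden_costs)
print(f"Estimated annual cost: ${total:,.0f}")
```

The same accumulation, run over avoided incidents instead of incurred ones, yields the hidden benefits of good data management, which is the symmetry the third bullet points to.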
Peter Aiken, Past President DAMA-I, International Data Management Association
Peter Aiken is acknowledged to be a top data management (DM) authority. As a practicing data consultant, author and researcher, he has been actively performing and studying DM for more than 30 years. His expertise has been sought by some of the world’s most important organizations and his achievements have been recognized internationally. He has held leadership positions and consulted with more than 75 organizations in 27 countries across numerous industries, including defense, banking, healthcare, telecommunications and manufacturing. He is a sought-after keynote speaker and author of 10 books (latest: “Your Data Strategy”), multiple other publications, and he hosts the longest running and most successful webinar dedicated to data management (hosted by Dataversity.net). Peter is the Founding Director of Data Blueprint, a consulting firm that helps organizations leverage data for competitive advantage and operational efficiencies. He is also Associate Professor of Information Systems at Virginia Commonwealth University (VCU), past President of the International Data Management Association (DAMA-I) and Associate Director of the MIT International Society of Chief Data Officers.
It’s a huge challenge to get the ‘little data’ right before you can apply advanced analytics techniques and show true value to your business. One of the main issues is ensuring that the internal and external data you’re integrating is standardized and consistent.
Join this workshop to:
• Show the importance to the business of being able to standardize formats across assets
• Embed consistent standards internally to make better decisions based on accurate data
• Work with vendors to obtain standardized data that is more easily integrated into your data lakes
• Harmonize naming conventions for migration across exploration and production internally and across the industry
• Discuss how to strategize more effectively between data managers, geoscientists, other end users and standards organizations to agree on the standards of the future
• Optimize data quality to lead to better analytics and decision-making further down the data value chain
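Harmonizing naming conventions of the kind the bullets above describe often starts with a simple normalization step before data lands in the lake. A minimal sketch in Python, assuming a hypothetical target convention (upper case, single-hyphen separators); real standards efforts involve far more than string cleanup:

```python
# Hypothetical sketch: harmonizing well-name formats received from
# different vendors before loading into a data lake. The target
# convention (upper case, hyphen separators) is invented for
# illustration and is not an industry standard.
import re

def standardize_well_name(raw: str) -> str:
    """Collapse separators and whitespace, then upper-case the name."""
    name = raw.strip().upper()
    name = re.sub(r"[\s_/]+", "-", name)   # unify separators to hyphens
    name = re.sub(r"-{2,}", "-", name)     # collapse repeated hyphens
    return name

# Three vendor spellings of the same (fictional) well
vendor_names = ["smith 14_2a", "SMITH-14/2A", "  Smith_14 2A "]
canonical = {standardize_well_name(n) for n in vendor_names}
print(canonical)  # all three variants map to one canonical name
```

Agreeing on that one canonical convention, internally and with vendors, is what makes downstream integration and analytics tractable.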
Jay Hollingsworth, CTO, Energistics
Jay Hollingsworth is currently the Chief Technology Officer for Energistics.
Jay has a BS in Chemical Engineering from Tulane University in New Orleans. In addition, he attended graduate school in Computer Science at the University of Texas at Dallas. Throughout his career as an environmental and process engineer, he focused on technical computing, first as a consultant and then for 20 years at Mobil Oil. At Mobil he was responsible for the data model of their FINDER global master data store and the suite of engineering applications in global use. After leaving ExxonMobil, he spent time in Landmark’s data modeling group before settling at Schlumberger. He spent 10 years at Schlumberger, where he was responsible for the data modeling group and was the portfolio manager for the Seabed database technology.
Jay is active in APSG, Energistics, PIDX and PPDM. He is a member of SPE and SEG and previously was a Technical Editor of the SPE Microcomputer Journal. He was a long-time member of the Board of Directors of PPDM and served as past president of APSG.