By Dr. Thomas Hill, Executive Director, Analytics, Information Management Group, Dell
Big-data predictive analytics offer the promise of better outcomes and lower costs for healthcare organizations, effectively allowing a patient to tap the expertise that thousands of clinicians have gained through treating millions of patients. But successfully deploying the technology isn’t always easy. How you plan for and introduce analytics is critical to acceptance by stakeholders and a willingness to take action based on the knowledge you generate.
The Locomotive from Nuremberg to Fürth: The fear effect of technology
Disruptive technologies almost always elicit initial skepticism and even fear. When the first steam locomotive prepared to make its run from Nuremberg to Fürth in 1835, people were concerned about noise and pollution and feared that human physiology might not support travel at speeds over 20 mph. Other useful-but-disruptive advances have also generated initial fear.
In research published over 30 years ago, I demonstrated how a lack of a sense of control generated fear when personal computers revolutionized computing at universities. Interacting with a black box that generates results as if by magic, without giving any control to the end-user, will always generate distrust.
Projects will fail if analytics technologies are perceived to usurp personal judgment and control over final decisions.
Presenting analytics to stakeholders as a tool they can use will empower them and pre-empt fear. Demonstrate how these new tools help healthcare professionals to quickly evaluate risks, potential outcomes, and what-if scenarios. Predictions, recommendations and prescriptions derived from analytics need to come with reasons why a particular risk is indicated and how recommended actions will affect outcomes.
Avoid alarm-fatigue with unambiguous, actionable alerts
People cease to pay attention when alarms are too frequent or information doesn’t present clear options for action.
Enhance, rather than add to, existing processes, screens and alarm rules, and ensure that important information is unambiguous, actionable and consistent with existing workflows. Think through where analytic results will be used, identify benefits and ROI, and make certain that information is actionable. For example, Dr. John Cromwell implemented a system at the University of Iowa Hospitals and Clinics that sends real-time, actionable risk information to the operating room, helping surgeons avoid post-surgical infections.
Don’t add to the onslaught of computer work
General surgeon Jeffrey Singer recently noted in the Wall Street Journal that rigid electronic health records systems promote “tunnel vision in which physicians become so focused on complying with the EHR worksheet that they surrender a degree of critical thinking and medical investigation.”
Analytics technology should be entirely hidden, yet deliver reliable information about risks, best next action and alternatives. Don’t require medical professionals to complete yet another computer screen.
Know the end point and how to measure results
One of the most important things to consider before embarking on any IT project is to clearly establish what the completed project would look like and how to measure success. Avoid projects that are attractive because of the “cool” technologies involved without clear definitions of success and ROI.
Think about what ideal results look like; who would use them and how; and how something of value would be created. Involve key stakeholders and end-users to reflect their concerns and perceived barriers to success. Once you know how success is exactly and operationally defined, everything else follows, such as where to look for what data, how results are delivered, what level of integration, training, operational changes or new resources/personnel are required.
Decide what data you need
Data acquisition and preparation is always the most time-consuming and difficult part of any advanced predictive analytics project. EMR systems are mostly closed and data from different sources and repositories use different labels and metrics for the same measurements. For example, reports from different laboratories may use different formats, scales and nomenclatures.
Before you begin, think through whether you need immediate ROI for a specific project or a more complete analytics solution. A project-specific approach allows you to go after low-hanging fruit using data that are easiest to get and integrate; a longer term approach, to support diverse projects, requires building a robust general data-analysis warehouse with a Master Patient Index, terminologies and translation logic, and incorporates adapters to allow integration with relevant data sources.
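To make the integration problem concrete, here is a minimal sketch, in Python, of the kind of translation logic a general data-analysis warehouse needs. The test labels, units and conversion factors are illustrative placeholders, not a real terminology standard:

```python
# Hypothetical sketch: normalizing lab results that arrive with
# different labels and units from different sources. The mappings
# below are invented for illustration only.

# Map each source's local test label to one canonical name.
LABEL_MAP = {
    "GLU": "glucose",
    "Glucose (fasting)": "glucose",
    "HGB": "hemoglobin",
    "Hemoglobin": "hemoglobin",
}

# Convert each reported unit into one canonical unit per test.
UNIT_CONVERSIONS = {
    ("glucose", "mmol/L"): lambda v: v * 18.0,   # to mg/dL
    ("glucose", "mg/dL"): lambda v: v,
    ("hemoglobin", "g/L"): lambda v: v / 10.0,   # to g/dL
    ("hemoglobin", "g/dL"): lambda v: v,
}

def normalize(record):
    """Translate one raw lab record into a canonical label and unit."""
    name = LABEL_MAP[record["test"]]
    value = UNIT_CONVERSIONS[(name, record["unit"])](record["value"])
    return {"test": name, "value": round(value, 2)}

raw = {"test": "GLU", "unit": "mmol/L", "value": 5.5}
print(normalize(raw))  # the same measurement, now in mg/dL
```

In practice, this mapping layer would be driven by standard vocabularies (for example, LOINC codes and UCUM units) rather than a hand-built dictionary, but the shape of the problem is the same.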
Finally, there is governance, though ideally this would come first. Often overlooked, governance is important for two major reasons. First, many projects initially succeed, but then fold after the project champion departs, leaving nobody who knows and understands how it all works and where the data are. Second, regulatory oversight and scrutiny will become important when analytics affect real patient outcomes.
A role-based system with lifecycle management, version control, audit logs, approval processes, etc., will solve the issue of departing champions as well as the need to document how predictive models were built, validated, approved and implemented. Good examples of this can be found among pharmaceutical and medical device manufacturers, which have for years incorporated mature governance features to meet these challenges.
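As a rough illustration of what lifecycle management with audit logging might look like, here is a minimal sketch. The states, field names and transition rules are assumptions for the example, not features of any particular product:

```python
# Hypothetical sketch of the governance record a role-based system
# might keep for each predictive model: version, lifecycle state,
# and an audit trail of who approved what, and when.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ModelRecord:
    name: str
    version: int
    status: str = "draft"  # draft -> validated -> approved -> retired
    audit_log: list = field(default_factory=list)

    def transition(self, new_status, user):
        """Advance the lifecycle state, recording who did it and when."""
        allowed = {"draft": "validated",
                   "validated": "approved",
                   "approved": "retired"}
        if allowed.get(self.status) != new_status:
            raise ValueError(f"cannot move {self.status} -> {new_status}")
        self.audit_log.append((datetime.now(timezone.utc).isoformat(),
                               user, f"{self.status} -> {new_status}"))
        self.status = new_status

model = ModelRecord("surgical-infection-risk", version=3)
model.transition("validated", "analyst_a")
model.transition("approved", "review_board")
print(model.status, len(model.audit_log))
```

Even a record this simple survives a departing champion: anyone can see which model version is in production, who validated it, and when it was approved.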
When advanced analytics projects fail and smart professionals decide not to leverage smart technology to improve outcomes, the cause is often project leaders who ignore critical steps when planning and implementing such systems. Future healthcare will inevitably rely greatly on advanced predictive and automated analytics to help health care professionals produce better patient outcomes more reliably and effectively. Getting it right from the start will create that future faster and benefit everyone.
I’ll be at HIMSS, April 13-15 in Chicago, and leading a tweet up discussion on the future of population health management on April 15 at 11 am. I look forward to hearing your thoughts on this topic.
About the author
Dr. Thomas Hill is Executive Director for Analytics at Dell’s Information Management Group. He joined Dell through the acquisition of StatSoft Inc. in 2014, where he was Senior Vice President for Analytic Solutions for over 20 years, responsible for building out Statistica into a leading analytics platform. He was on the faculty of the University of Tulsa from 1984 to 2009, where he conducted research in cognitive science. Dr. Hill has received numerous grants from the National Science Foundation, National Institute of Health, Center for Innovation Management, Electric Power Research Institute, and other institutions. Over the past 20 years, his teams have completed consulting projects with companies across various industries in the United States and internationally. Dr. Hill is the (co)author of several books, most recently of Practical Text Mining and Statistical Analysis for Non-Structured Text Data Applications (2012) and Practical Predictive Analytics and Decisioning Systems for Medicine (2014).
At the risk of dating myself, I’ve probably attended 70 or so TDWI conferences and executive summits, but the recent TDWI Las Vegas was different. It marked the organization’s 20th anniversary and reinforced the importance of analytics, which was a big topic among attendees, speakers and vendors.
As a TDWI faculty member for the fourth consecutive year, I taught a whole-day class on social analytics, which focused on driving business values with big data. But more on that subject in my next post. In this blog, I’d like to step back and address changing analytics dynamics.
This vital area has come a long way from its roots in data management, reporting and BI. I reminisced with TDWI president Steven Crofts about all the changes that have taken place over the years. It was fun to remember when TDWI, aka The Data Warehouse Institute, was about innovations in data processing and warehousing. Fast-forward two decades: We’re talking about big data, social analytics, machine learning and cognitive computing.
The same quantum leap holds true when talking about analytics, which has evolved into highly automated, somewhat transparent solutions for ingesting, integrating and leveraging vast amounts of information. As analytics mature, organizations of all types are looking at how they glean greater business value. Some industry segments, like pharmaceutical, manufacturing and retail, are ahead of the curve because market disruption has led to adopting new analytical capabilities and advanced workloads to produce real-time fraud and risk analyses as well as quality control and other complex insights.
Many companies I spoke with are in the midst of retooling their analytics foundations to be more successful. Others are making bold moves. One maker of industrial French-fry equipment is using sophisticated analytics and sensors to better monitor equipment vitals (e.g., the filter is dirty, the heating element isn’t hot enough). In doing so, they can assure their restaurant customers that their equipment is performing optimally.
It doesn’t stop there. They also want to listen to the social signals of their customers’ customers—the folks eating fries—to better understand satisfaction levels through trend analyses. If they learn through social analytics that customers complained about substandard fries at their customers’ establishments, they could proactively help the restaurant take action. As a result, this company will be able to differentiate themselves by helping their customers before something hurts their brand. How cool is that?
Regardless of where companies are in the analytics adoption curve, there are major drivers accelerating change across the entire data landscape. As business analytics mature, there will be continued movement along these four pressure points:
New types of data can serve a wider community of users who will want to mix, match and mash up information to yield value in responding to business needs. This will lead to a more mature mantra, from any company looking to innovate: “Put the right data, on the right platform, at the right time, for the right workload.”
In my next post, I’ll offer more insights from TDWI by sharing experiences and more real-world examples from my class, “Social Analytics: Driving Real Business Value with Big Data.” Until then, drop me a line at Shawn.Rogers@dell.com to share your mantra for business analytics.
Innovative companies are adopting advanced analytics to take action and match the speed of their business. This is especially true in the world of manufacturing, where complicated, process-driven activities benefit greatly from smarter, faster analytic insights and actions. Collecting and analyzing data from sophisticated manufacturing processes requires a flexible and agile infrastructure that supports a wide variety of disparate data sources, often spanning sensor and machine sources combined with instrumentation and testing data, machinery and production data, customer and market data, supply chain information, third-party benchmarks and a wide assortment of system data.
In the case of pharmaceutical manufacturing, these systems can include laboratory information management systems (LIMS), manufacturing execution systems (MES), enterprise resource planning (ERP) systems, supervisory control and data acquisition (SCADA) systems and, last and perhaps most difficult to manage and integrate, paper documentation. Bringing this data together in an action-oriented manner requires accurate planning and solid project management principles.
Sanofi Genzyme, part of the third-largest pharmaceutical company in the world, is embracing this challenge and utilizes the following criteria when planning process-driven analytic projects.
Genzyme has built an agile architecture named APEX to bring data together to support advanced analytics. APEX is designed to bridge decentralized, heterogeneous data sources and provide a centralized, secure and validated data layer for analytics. Genzyme recognized early on that it needed to leverage real-time data as well as process-oriented data in order to get the analytic insights it desired; APEX accommodates these needs through a traditional data warehouse working in tandem with a process-oriented data store. Genzyme also took into account how important data lineage, integrity and validation are to its compliance initiatives and maintains strict control over all data sources throughout the analytic process.
The framework that Genzyme has developed creates value across the company as it supports a repeatable and scalable environment designed to meet the evolving needs of their manufacturing processes. In the end, the following themes are crucial to the success of this and future projects.
For more details on process driven analytics for manufacturing and the Genzyme story, watch this webcast: "Business Analytics in Regulated Manufacturing".
Statistica users: Make your voice heard in Dresner’s 2015 Wisdom of Crowds® survey by March 20 and you may qualify for a complimentary copy of the study findings.
We invite you to help represent Dell Statistica in this 6th annual examination of the Business Intelligence (BI) marketplace, which covers BI deployment trends and related areas including cloud BI, collaborative BI, advanced and predictive analytics, cognitive/artificial intelligence, data storytelling, and an all-new section on enterprise planning. Statistica users in all roles and throughout all industries are invited to contribute their insights; the survey should take approximately 20 minutes. Take the survey yourself and share the link with colleagues and customers: www.dresnersurvey.com/TR8DLYN. We appreciate your support and thank you for completing the survey by March 20, 2015.
by Danny Stout
Manufacturers have a long history of successfully employing data - big data - to help make important and insightful business decisions. According to a recent article in CMSWire by Joanna Schloss, a subject matter expert specializing in data and information management at Dell, early adoption means the industry is set to be a primary benefactor of the big data analytics boom.
Schloss submits that as an early adopter of big data, with a ubiquitous presence in society, and unparalleled access to data collection, the manufacturing industry has a plethora of new revenue streams available to it. In her article, she outlines three:
The potential of big data is now a reality for every industry. Manufacturing just happens to be positioned to immediately begin delivery and reaping the rewards.
You can read all of Joanna's insights here.
by Uday Tekumalla
Predictive analytics are used by companies for everything from customer retention and direct marketing to forecasting sales. But at the University of Iowa Hospitals and Clinics, predictive analytics are serving a far more noble purpose - to decrease post-surgical infections.
By utilizing a number of data points gathered from 1,600 patients, each of whom underwent colon surgery at the University’s hospitals, the medical teams have dramatically reduced the number of patients afflicted with post-surgical infections. In fact, over a two-year period, those infections were slashed by an impressive 58 percent.
That is an impressive feat. There are, after all, a multitude of variables that can lead to an infection. This analysis considered several different data points: patients’ medical histories, data from monitoring equipment, data from national registries, and real-time data collected during surgery, such as blood loss and wound contamination. The University built predictive models using Dell Statistica predictive analytics software to achieve these impressive results. Running this analysis allows the hospital to determine a patient’s risk level for post-surgical infection, providing the medical team with clear insight into the medications and treatment plans to employ going forward to minimize the risk of infection.
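To illustrate the general idea (not the University of Iowa’s actual model), here is a minimal sketch of a logistic risk score that combines historical and intraoperative data points. The features, weights and intercept are invented for the example:

```python
# Illustrative sketch only: a logistic risk score built from a few
# of the kinds of data points mentioned above. The coefficients are
# placeholders, not values from any real clinical model.
import math

# Hypothetical coefficients, as if fitted on historical surgical cases.
WEIGHTS = {
    "diabetic": 0.9,             # from the patient's medical history
    "blood_loss_ml": 0.002,      # real-time, collected during surgery
    "wound_contaminated": 1.4,   # real-time observation
}
INTERCEPT = -3.0

def infection_risk(patient):
    """Return an estimated probability of post-surgical infection."""
    z = INTERCEPT + sum(WEIGHTS[k] * patient[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))  # logistic function

low = infection_risk({"diabetic": 0, "blood_loss_ml": 150,
                      "wound_contaminated": 0})
high = infection_risk({"diabetic": 1, "blood_loss_ml": 600,
                       "wound_contaminated": 1})
print(round(low, 3), round(high, 3))
```

A score like this is exactly the kind of output that can be turned into an unambiguous, actionable alert: the surgical team sees a single risk level, plus the factors driving it.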
Along with providing better patient outcomes the University of Iowa also has likely reduced medical costs. This is an exciting example of the potential of predictive analysis. Learn more about the university's results here.
Whittaker finds it increasingly difficult to talk about analytics without also talking about the role of cloud.
In her latest CMSWire article, Schloss describes three reasons why manufacturers are uniquely poised to be primary benefactors of the big data analytics boom.
Rogers also reacts to the recent Enterprise Management Associates (EMA) research study, reflecting on the competitive advantages that can result from “cloud first” thinking.
At Dell’s recent Big Data 1-5-10 event, I kicked off my introduction by saying my goal is “to help customers use 100 percent of their available data all the time.” This remark caused a few heads to turn, and later prompted Jeff Frick, GM of SiliconANGLE and host of theCUBE live interview show, to ask me for more insight into what he called a “provocative statement.”
Shouldn’t we all be driving toward collecting, analyzing and utilizing data to its fullest? As I explained to Jeff, we’re nowhere near ready to deliver all the data, all the time, but we need to make steps in that direction so we’ll be ready to clear the hurdles and take full advantage of opportunities as they become available.
Technology is still siloed, unfortunately, which makes it difficult for people to build out all the analytical models today that can deliver answers to their most critical questions. Structured and unstructured information isn’t analyzed together, which creates another barrier to getting one single view of the truth. Another barrier: people doing the analytics address very specific, often narrow areas of focus.
Currently, most companies use only a subset of their data for a very specific purpose. But, you can discover so much more if you step back and take a larger view. For example, instead of only looking at revenue trends over the past 12 months, what could be learned if you look more broadly at the health of your company’s customer base or the social factors driving trends and behaviors that either accelerate or moderate a drop or move in your business?
Delving deeper into the data delivers so much more insight. At the University of Iowa Hospitals and Clinics, for instance, Dell Statistica is used to pull data from a wide variety of data sources to help lower the rate of infection for surgical patients. As reported in the Wall Street Journal’s CIO Journal, the University of Iowa takes information from patients’ medical records and surgery specifics, such as patient vital signs during operations, to predict which patients face the biggest risk of infection.
Armed with this valuable insight, doctors can create a plan to reduce the risk by altering medications or using different wound treatments.
Thanks to the evolution of analytics, other organizations will be able to follow University of Iowa’s lead in more fully utilizing their data. We’re at a tipping point—compute cycles now are affordable enough and can keep pace with data proliferation while plentiful bandwidth and cloud services make ubiquitous data access a reality. Today’s infrastructures enable us to do things that weren’t possible five years ago.
While environments now are ready to accommodate a more holistic view and broader conversations about data, most companies are just starting to buy in conceptually. Sure, companies want access to all their data, all the time, but most folks I speak with see this as an aspirational goal still to be achieved. When it comes to the here and now, they’re pretty pragmatic and taking the first steps to realizing their data’s full potential.
Since focus is the hallmark of success, I recommend putting customers first. Start by taking all the steps you can to get all the data on your customers. Then, gather all the data on your product areas, supply chain, manufacturing, etc. In each respective area, there likely will be a dozen different data sources that are interconnected and interrelated. For instance, in compiling data on customers, you’re likely to encounter exposed interfaces that take you to product, which can be integrated with manufacturing, and so on. It’s kinda like assembling LEGO blocks or deciphering fractal patterns as all the data elements are nested and interwoven.
Another major step is determining how best to empower your data analysts by providing them with the right tools for producing everything from simple reports and visualizations to complex analytics. But don’t stop there. If your data is locked away and only useful for PhD modelers and data scientists, you’ll only solve part of your problems. Getting data into the hands of your subject matter experts and line-of-business decision makers is crucial because they too must be empowered to build their own analytical models.
The day when employees become their own data analysts isn’t too far out on the horizon. Once everyone has access to all the data, all the time, they can create their own hypotheses. Training your employees to think more analytically is something every organization should already be doing to stay ahead of the curve.
What steps are you taking to ensure your company gets the most from all its data, all the time? Drop me a line at firstname.lastname@example.org to exchange ideas on how to unlock the power of your data.
This blog will be where members of the Statistica team--a dedicated group of trouble-shooters, project managers, subject matter experts, sales engineers and thought leaders--can freely connect with each other and with members of the broader Statistica user community.
We are excited to share our thoughts about issues and trends and products within the dynamic realm of statistics, business analytics, big data, predictive solutions, and information management. This community is also intended to facilitate your use and comprehension of the award-winning Statistica analytics platform, through the eventual sharing of blogs, webcasts, media, whitepapers, tools, solutions, and more. We appreciate your patience as we roll out relevant content here, and we recommend you visit the excellent resources already available via the Statistica product page. Meanwhile, come back, and come back often, to learn what’s new in the world of Statistica.
Or better yet, join the community if you’re not already a member and visit our main blog page to email subscribe to this blog. (The link is under "Options.") Feel free to give us feedback and suggestions by posting your comments on a blog post or a how-to wiki article. We want to provide tools and solutions that are practical and helpful for you.
-- The Dell Statistica Team
As I took in the sights and sounds of Dell World earlier this month in Austin, and more importantly, soaked in the palpable and almost inescapable buzz surrounding Dell’s information management capabilities, I kept coming back to the same thought: What a difference a year makes.
This year marked the fourth annual iteration of the event, and the third I’ve had the pleasure of attending as a member of the Dell team. Since its inception, Dell World has been nothing short of a marquee industry event, its agenda lined with preeminent speakers, informative sessions, rich networking opportunities and world-class entertainers. So it spoke volumes – about both the progress Dell has made as a leading provider of information management solutions and the growing importance of information management to customers – that big data, data analytics, and the ability of companies to transform data into insights was front and center throughout the event.
There’s no greater endorsement of an IT trend’s importance to Dell as a company and to the industry at large than when Michael Dell makes it the focal point of his opening keynote address at Dell World, and this year, that address focused on “The Power and Promise of the Data Economy.” As Michael made crystal clear in his speech, the road to competitive advantage in today’s economy is paved with data, and organizations that seize the opportunities afforded by the data economy will be the ones that succeed.
A huge part of doing that, of course, is leveraging modern advanced analytics to better understand your businesses, predict change, increase agility and control critical systems. So it should be no surprise that Michael’s keynote was followed by a fantastic panel discussion on modern data analysis featuring Dell Software’s own Dr. Thomas Hill, one of the founding minds behind the creation of the Statistica advanced analytics platform Dell acquired from StatSoft earlier this year. My opinion is biased, of course, but I thought Dr. Hill was hands down the star of that particular show.
But it wasn’t just the keynote. Information management was everywhere at Dell World. No less than 10 sessions during the event focused on the need to analyze, integrate and manage data and information, including in-depth technical sessions on realizing the power of connected intelligence with Toad for Oracle and performing customer churn analysis with Statistica, as well as broader discussions on the importance of making data the lifeblood of your business. I personally had the pleasure of dropping by the Dell World News Desk to discuss the importance of turning data into insights, and our vice president and GM Matt Wolken had a similar dialogue with our good friends at theCUBE.
Speaking of news, it wouldn’t be Dell World without some major announcements, and in that regard, information management stood out. This year, we announced two major initiatives that will drive our analytics capabilities forward. First and foremost, we announced the start of a new collaborative effort with Microsoft aimed at enabling customers to perform powerful predictive analytics within a scalable, cost-effective hybrid cloud environment. The effort enables customers to use Statistica in tandem with the Microsoft Azure Machine Learning Service. In addition, we also announced that we are upgrading Statistica with enhanced big data capabilities through integration with Kitenga.
Of course, while we’re thrilled to see information management taking on a growing importance across Dell, and more importantly, across our customer base, we really are just getting started. We have many new initiatives we’ll be unveiling – and major announcements we’ll be making – in the coming months, and by the time Dell World 2015 rolls around, I have no doubt that I’ll again marvel at what a difference a year makes.
If you haven’t followed Dell’s growth in enterprise software, you’ll be surprised to learn that it’s making enough small, medium and large companies happy to become a $2 billion-per-year business.
Dell Software is gaining momentum as a complementary offering to Dell’s computing hardware. Through a strategy of steady development and acquisition, software has become a huge business for Dell, with the Information Management Group a strong contributor.
In an interview at Dell World early this month, Matt Wolken, vice president and general manager of Dell’s Information Management Group, described how the group has grown and changed over the last few years. Its strategy and approach have been to keep up with its customers’ ever-increasing need for tools for databases, analytics, integration and development, while supporting environments as varied as Oracle, DB2, SQL Server and Sybase, plus dozens of different data sources.
You can see the entire interview in this video, which includes these highlights:
The interview offers insight into the role of the Information Management Group and some of the upcoming moves you can expect from us in database tools and analytics.
In the world of big data, are you an analytical producer or an analytical consumer?
Don’t worry; one isn’t necessarily better than the other. In fact, analytical producers and consumers need each other, as we explain in our white paper, “Big Data Analytics in Action.” Analytical producers use data mining, predictive analytics, machine learning and natural language processing to produce models, which analytical consumers use to drive the business forward.
Another important topic we cover in the paper is the difference between big data and analytics, a difference that sometimes gets blurred in the conversations between IT and business managers.
Big data is about the storage, speed, performance and functionality of hardware and software pulling information into your organization. Analytics is about enabling informed decisions and measuring impact on your business. Big data drives innovation in analytical technologies, and this white paper introduces you to the most prevalent of those analytical technologies, as shown in the paper’s diagram.
The white paper also contains concrete applications of analytics and big data in marketing, finance, healthcare, pharmaceuticals and manufacturing, along with a series of tips to ensure success in your next analytics project.
Get your copy of “Big Data Analytics in Action” and keep an eye out for more content on our Information Management blog in Dell TechCenter.
What if predictions in healthcare could be as personalized and accurate as they are on Amazon or Netflix?
A new book, “Practical Predictive Analytics and Decision Systems for Medicine,” helps organizations move in the right direction by providing a step-by-step guide to applying predictive analytics to healthcare.
The book’s team of authors includes two members of the Dell Information Management Group who joined Dell following the company’s acquisition of StatSoft: Dr. Thomas Hill is the executive director for analytics in the Information Management Group; Dr. Gary Miner is the senior analyst and healthcare applications specialist in that group. StatSoft, which created Statistica software, has long provided analytics capabilities for the healthcare industry.
In a recent Q&A with Hill, Miner and lead author Dr. Linda Winters-Miner, these writers highlight the promise of analytics for healthcare. “If you could take the so-called Amazon experience of highly personalized, accurate profiling of what someone is probably going to do next, and turn that loose on healthcare, common sense tells you there’s an incredible amount to be gained,” says Hill.
Some forward-looking hospitals are already using predictive analytics and decisioning systems to achieve key healthcare goals, such as reducing infection risks. “As a surgery is taking place, they’re inputting real-time data into a decisioning system,” says Hill. “The surgeon can then use [the insights generated] to make on-the-spot decisions about how to go about closing and treating the incision in order to reduce the risk of infection.”
Despite the great promise for analytics, the industry has a long way to go. “I don’t think we’re at a point where we can say most organizations are even dabbling in analytics for predicting individualized diagnosis and treatment,” says Miner. “There are certain exceptions, but most organizations are 15 years behind the time.”
This book could provide valuable guidance to help organizations understand how analytics is transforming the healthcare industry – from patient, to payer, to provider. It could also help invigorate the profession. “I hope it gets people excited and hopeful about their ability to change the industry,” says Winters-Miner.
Read the full Q&A with Hill, Miner and Winters-Miner on the Direct2Dell community site, and check out the book.
Interested in exploring Dell Statistica software? Try Statistica for free!
by Joey Jablonski
Recently, insideBigData highlighted some exciting developments announced at Dell World: a new series of solutions offered by Dell to help enterprises make the most of their big data by enabling improved customer relationships and transforming data into actionable business insights. The comprehensive, digital business services help customers redesign processes to improve operational efficiency and better engage customers.
The customer solutions focus on three important areas:
To meet these goals, Dell Services has launched Dell Digital Business Services. Dell DBS helps customers better understand their end users’ preferences and transform business processes using digital technologies. It does this by consulting with customers to assess their needs and develop a plan to address them with digital technologies, while offering access to Dell's comprehensive end-to-end portfolio. Additionally, Dell DBS has built strong relationships with several partners across the digital transformation technology areas. Among them are Kony, Apperian, Cloudera, and Informatica.
Complementing these efforts, Dell Software has also announced a new collaborative effort with Microsoft. This collaboration will allow customers to run powerful predictive analytics within a scalable, cost-effective hybrid cloud environment.
Additionally, Dell Software launched a new big data analytics module within its Statistica advanced analytics platform. This new module embeds Kitenga's big data analytics capabilities directly into the Statistica user workflow. Statistica users can now crawl various forms of unstructured data and combine it with large-scale structured data to gain a richer understanding of customer behavior and market opportunities.
You can get greater detail and learn more about these Dell World announcements at insideBigData.
You want to run your business on data and deliver results right now. Who doesn’t? But to achieve these goals, you need to bring together data from disparate sources and perform in-depth analysis. How can you achieve agile data integration while helping to ensure the quality and consistency of data at the same time?
In this Dell on-demand webcast, Philip Russom, research director on data management for the research firm TDWI, outlines three pillars for agile data integration. By building on these pillars, organizations can deliver data integration solutions sooner, better align solutions with business goals and ultimately free up resources to develop more solutions.
Pillar 1. Enable self-service data integration
The process of gathering requirements for data integration can be time-consuming, but according to Russom, it doesn’t have to be. Providing technical and business teams with self-service tools that incorporate data profiling, data discovery and data visualization capabilities can accelerate the process. Those tools help teams record requirements as they work, helping to eliminate the weeks or months of interviewing various stakeholders.
Pillar 2. Capitalize on rapid data set prototyping
Creating data set prototypes early in the data integration process allows you to sustain high data quality and avoid issues down the road. Fortunately, many self-service tools enable rapid prototyping of data sets. Technical and business team members can conduct simple data extractions and transformations to produce prototypes quickly and easily.
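A data set prototype can be as simple as a quick extract-and-transform pass that surfaces quality problems before a full integration effort begins. The sketch below is purely illustrative, using a hypothetical CSV extract and Python's standard library; it is not a representation of any specific self-service tool.

```python
import csv
import io

# Hypothetical raw extract; in practice this would come from a source system.
raw = """patient_id,visit_date,charge
1001,2015-03-02,250.00
1002,2015-03-02,
1001,2015-03-05,75.50
"""

def prototype_dataset(text):
    """Quick extract-and-transform prototype: parse the extract, drop rows
    with a missing charge, and convert charges to floats so data-quality
    issues surface early rather than late in the project."""
    rows = list(csv.DictReader(io.StringIO(text)))
    clean = [dict(r, charge=float(r["charge"])) for r in rows if r["charge"]]
    dropped = len(rows) - len(clean)
    return clean, dropped

data, dropped = prototype_dataset(raw)
print(len(data), "usable rows;", dropped, "dropped")
```

Even a prototype this small tells the team something concrete: one of three rows has a missing charge, which is exactly the kind of finding that should feed back into requirements before the full pipeline is built.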
You can achieve agile development and delivery of data integration solutions while also addressing responsible data access and preparation requirements that help ensure the quality and consistency of data.
Pillar 3. Employ data stewardship and facilitate collaboration
Data stewardship plays an important role in successful data integration. A data steward is a member of a business group who helps ensure data management efforts meet business requirements and who can deliver a rapid return on investment. When data stewards collaborate with technical staff on data integration projects, organizations can better align technical work with business requirements. The result is faster development of data integration solutions and fewer overlapping tasks that can delay project completion.
Having the right tools can make it easier for organizations to build on these pillars. In the same webcast, Peter Evans, a business intelligence and analytics product evangelist at Dell, highlights Dell software for information management that can help organizations take advantage of these pillars and achieve successful, agile data integration.
Ready to learn more? View the data integration webcast.
Do you ever feel that something is thwarting your data discovery efforts? Or are you an IT manager who thinks that business users just don’t understand data governance? Whichever side you’re on, you’re not alone. Business users and IT managers in plenty of other organizations share those sentiments.
Like many companies, yours probably has a much greater volume and variety of data at its disposal than ever before. Your business users are eager to explore and analyze that data so they can generate new insights to help your company capitalize on opportunities and meet its business goals.
But something is standing in their way. As business users see it, that something is IT. Your IT department is tasked with data governance — making sure data is accurate, complete and secure. And unfortunately, data governance processes and policies impose restrictions that can hinder business users’ ability to freely explore data and get the answers they need when they need them.
Neglecting data governance is not an option. But if business users have trouble accessing or using data, your company could miss some vital opportunities. So, what’s the answer?
A new approach: Collaborative data governance
According to the Aberdeen Group, implementing a “collaborative data governance” approach can help eliminate the conflict between IT and business groups, and improve data discovery. Collaborative data governance opens a dialog between IT and business users. IT gains greater visibility into user needs, and business users learn the proper procedures for accessing data so they can work with IT to optimize their data discovery experience. The groups share the responsibility for governing data and finding new ways to maximize its value.
In its 2014 Business Analytics survey, Aberdeen found that organizations with collaborative data governance enjoy several important benefits:
Creating a culture of collaboration between IT and business users can yield benefits that extend beyond specific data discovery requests. For example, as Aberdeen suggests, collaboration can help prevent data silos from forming and facilitate more enterprise-wide access to data, while ensuring data is accurate and secure.
Beyond encouraging collaboration, having the right BI tools is also essential. The Aberdeen survey showed that 72 percent of collaborative organizations use data management and data quality tools. Those tools help simplify data discovery while streamlining governance.
Whether you’re a frustrated IT administrator who’s tired of having to say “no” to new requests or a frustrated business user who’s tired of wading through bureaucracy to access data, it’s time to reach out to the other side. Working together, you can increase the value of data while maintaining the levels of governance your organization requires.
Learn more about collaborative data governance and the advantages you can gain from this approach by reading the Aberdeen report, “Collaborative Data Governance: Peeling the Red Tape off Data Discovery.” If you’re looking for more insight on how to bridge the divide between IT and business groups, read Peter Evans’s recent blog post, “Bringing Business and IT Together for Better Business Intelligence.”
Last month, I authored a blog on Direct2Dell outlining the great momentum we’ve seen in the advanced analytics space since acquiring StatSoft back in March. If you haven’t read it yet, it’s a good way to understand the synergy that exists between Statistica and the broader Dell portfolio, and it outlines the many integration points Dell’s customers can look forward to in the months ahead.
But don’t let all of the exciting things we have planned for down the road obscure the fact that Statistica is already one of the market’s leading advanced analytics platforms, and that right now – today – it should be at the top of the list for anyone looking to invest in an advanced analytics solution in order to facilitate better, faster decision making. Here are 10 of the many reasons why:
1 – Predictive capabilities. When it comes to performance and capabilities, Dell Statistica takes a back seat to no one, offering a comprehensive set of data mining, predictive analytics and data visualization capabilities that companies need to better understand their businesses, predict change, increase agility and control critical systems.
2 – Ease of use. User surveys confirm that simplicity has long been a hallmark of the Statistica platform. It’s simple to install and administer, features a user-friendly interface, and doesn’t require proprietary coding skills.
3 – Low TCO. In keeping with Dell’s heritage as a company that makes innovative solutions attainable to the masses, Statistica is a cost-effective solution that fits the budgets of small and mid-sized companies.
4 – Scalability. Just because it’s simple and affordable doesn’t mean it’s not scalable. Statistica offers the scalability and high-performance technology needed to work on the enormous datasets common in enterprise accounts.
5 – Integration. Statistica easily integrates with existing data stores, including traditional RDBMS, distributed file systems, and data services, as well as emerging data storage technologies. It also embraces open standards, seamlessly integrating with R, PMML and other industry standard protocols.
6 – Role-based support. Statistica was built to accommodate users of differing skills and roles, all of whom collaborate across different parts of analytic operations. The platform offers personalization options that ensure user groups have access only to the data, interfaces, and workflows relevant to their areas of responsibility. In this way, Statistica deftly manages business processes and increases data security.
7 – Real-time monitoring and reporting. Easy-to-use dashboards and Live Score™ functionality make it possible for organizations to support on-demand business needs, even when faced with thousands of simultaneous data inputs from line-of-business applications.
8 – Vertical and LOB prowess. Statistica is designed to meet the specific needs of customers across a variety of verticals and lines of business, with a long, successful track record in manufacturing, healthcare, pharmaceuticals, banking, and marketing. Customers use it to aid fraud detection, personalize marketing offers, and improve patient outcomes.
9 – Track record. Speaking of track record, Statistica has been around as long as Dell itself, and over the course of its 30-year history, has been trusted by world-class businesses and academia alike.
10 – Validation. Statistica has repeatedly been recognized as one of the leading advanced analytics platforms by top industry analysts, most recently by both Hurwitz & Associates and Dresner Advisory Services, for the significant value it delivers to customers.
With more companies – including more of your competitors – than ever before now leveraging predictive analytics to gain an edge in the marketplace, there’s never been a better time to get to know Statistica. We have a team of experts ready to help you better understand how predictive analytics can drive your business forward.
Just interested in chatting about advanced analytics? Drop me an email at email@example.com or send me a note on Twitter at @johnkthompson60.
There’s no doubt that big data can create big opportunities. From online retailers and telecommunications firms to financial service companies and government agencies, organizations across a diverse array of fields recognize that analyzing the large volume and variety of information available to them can play a big role in achieving their goals. Insights drawn from big data can help organizations enhance the customer experience, increase internal efficiencies, improve fraud detection, identify new growth areas and more.
Given the potential benefits, what’s preventing more organizations from capitalizing on big data today?
In some cases, it’s a matter of finding the right tools — and organizations often need a fairly wide range. They need tools for data integration and data management as well as the analytics and business intelligence tools that will ultimately generate the new insights. At the same time, they need a robust infrastructure with the scalable capacity to accommodate growing collections of data plus the performance for delivering timely insights.
For midsized organizations in particular, a lack of in-house expertise can slow or stall big data analytics projects. These organizations might need assistance integrating data sources, implementing analytics solutions and deploying the infrastructure required to support big data.
Offering a unique combination of solutions
According to a recent solution brief published by Enterprise Strategy Group (ESG), Dell is uniquely qualified to address these challenges. “Dell is one of the very few companies possessing the right ingredients to really reshape the big data and analytics market,” writes Nik Rouda, senior analyst for ESG. By working with Dell, organizations can get the software, hardware and services they need from a single vendor.
Changing the marketplace
By offering a comprehensive, integrated portfolio of software along with hardware and services, all from a single vendor, Dell is bringing the benefits of big data and analytics to a wider range of organizations. According to Rouda, “The new focus on building a complete technology stack for midmarket and departmental environments will be well received by a segment of the market that has been underserved.” Big data will not have to be just for big companies any more.
StatSoft has been a part of Dell for several months, and this is a good opportunity to gauge the fit.
If you’ve followed Statistica, you’ll be glad to know that we’ve retained the Statistica brand, and we’re in Dell Software’s Information Management portfolio under Business Intelligence (BI), alongside Toad BI Suite, Boomi and Kitenga.
If you’re not yet familiar with Statistica, I’ll explain why that’s such a good place for us to be.
Once you’ve collected enough data, you’ll want to do three things with it: mine it, visualize it and use it to predict the future. Advanced analytic tools are in use across all industries and business functions. Advanced analytics allow you to:
Statistica is known for enabling its customers to find patterns in huge amounts of data and make decisions based on those patterns. As Gregory Piatetsky of the KDnuggets site pointed out, Statistica rounds out Dell’s portfolio of information management tools, as depicted in the image below.
As a part of the Information Management product portfolio, Statistica gives Dell competitive advantages in emerging areas of advanced analytics. Thomas Hill, Ph.D., executive director of analytics for Statistica, talked about several of those areas in a KDnuggets interview:
That’s the big picture for big data as Statistica continues growing inside Dell. Find out more about Dell Statistica, and subscribe to the Information Management blog on TechCenter for more ideas on putting advanced analytics to work for your organization.
In no uncertain terms, the use of predictive analytics is crucial for SMBs to remain competitive. A recent survey revealed that companies that rate themselves substantially ahead of their peers in their use of data are three times more likely to rate themselves as equally ahead in financial performance. Of those surveyed, 36 percent reported they have measured positive top- and bottom-line impact from their predictive analytics efforts.
Which areas of marketing can benefit from predictive analytics? Almost every area. Here are a few examples:
Despite the growing awareness of these benefits, predictive analytics remains something of an enigma when it comes to implementation in marketing. Take retail outlets, for example. As a marketing manager, imagine a photograph of a person with a shopping cart walking down an aisle packed with products. What would be the most interesting analytical data one could get out of this? Looking at the shelves to see which products are depleted, for forecasting? It is pretty obvious that one can track inventory using sophisticated supply chain management techniques, but that’s not predictive analytics.
I think it’s much more interesting to look at the shopper’s receipt. By looking at receipts, we can begin to assess what the shopper’s needs and wants are. What items and quantities does the shopper typically buy? Is there a preference for self-checkout lanes or full service? Is there a time of day preference? Is there brand loyalty or price sensitivity? Are payments done with cash, by debit card, or by credit card with a reward incentive? We may even be able to assess the shopper’s attitude toward privacy—is the name, phone number or address printed on the receipt?
Analyzing this data enables marketers to make very useful predictions about what this shopper may do in the future. As the number of receipts for this shopper grows, and as the receipts for all shoppers are aggregated, the ability to predict individual and group behavior increases, enabling highly targeted marketing campaigns. The usefulness of predictive analytics is undeniable for businesses of all sizes.
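To make the receipt idea concrete, here is a deliberately naive sketch: it aggregates a shopper's receipts and ranks items by purchase frequency as a stand-in for a real predictive model. The shopper name, items, and data are all hypothetical, and frequency counting is only the simplest possible baseline for "what might this shopper buy next."

```python
from collections import Counter

# Hypothetical receipt history: each receipt is a list of purchased items.
receipts_by_shopper = {
    "shopper_42": [
        ["milk", "bread", "eggs"],
        ["milk", "coffee"],
        ["milk", "bread", "butter"],
    ],
}

def predict_next_purchases(receipts, top_n=2):
    """Rank items by how often they appear across a shopper's receipts.
    A real system would layer in time-of-day, brand loyalty, price
    sensitivity and so on; this baseline uses raw frequency only."""
    counts = Counter(item for receipt in receipts for item in receipt)
    return [item for item, _ in counts.most_common(top_n)]

print(predict_next_purchases(receipts_by_shopper["shopper_42"]))
# milk appears on all three receipts, bread on two -> ['milk', 'bread']
```

The same aggregation, run across all shoppers rather than one, is what turns individual receipt histories into the group-level predictions that drive targeted campaigns.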
So how do you get started? First, you need to recognize that predictive analytics is not where you will start your analytics journey. The first step is always to get “street smart” about your data.
What should you be collecting and how should you do it? How should you be modeling data? Once you understand this, you can begin to make incremental investments in your infrastructure to support data integration—bringing all the different data sources together—and then look for an analytics software solution in order to start creating the algorithms you’ll need for prediction.
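"Bringing all the different data sources together" often starts with a join on a shared key. The following is a minimal sketch under assumed data: two hypothetical sources (CRM records and web activity) merged on a customer id, with nothing here representing any particular vendor's integration product.

```python
# Hypothetical source 1: CRM records keyed by customer id.
crm = {
    101: {"name": "Acme Corp", "segment": "SMB"},
    102: {"name": "Globex", "segment": "Enterprise"},
}

# Hypothetical source 2: web activity as (customer_id, page) events.
web_visits = [(101, "/pricing"), (101, "/demo"), (103, "/blog")]

def integrate(crm_records, visits):
    """Left-join web visits onto CRM records, counting visits per customer.
    Events for ids absent from the CRM (e.g. 103) are simply ignored here;
    a real pipeline would route them to a data-quality review instead."""
    merged = {cid: dict(rec, visits=0) for cid, rec in crm_records.items()}
    for cid, _page in visits:
        if cid in merged:
            merged[cid]["visits"] += 1
    return merged

result = integrate(crm, web_visits)
print(result[101])
```

Once the sources are joined like this, the merged records become the input on which predictive algorithms can actually be trained.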
One thing to keep in mind: Many of the available solutions and services were built for large enterprises, meaning they don’t necessarily meet the needs of SMBs in terms of scalability and budget. It is important to take a careful look at vendors who specialize in medium-sized business needs.
Interested in deeper discussions about your specific analytics needs? Feel free to reach out to me at Shree_Dandekar@dell.com or on Twitter at @Shree_Dandekar.