So far, 2016 has been a year of product innovations, market validations and customer successes—a trifecta in most marketing circles. It is, of course, the latter part of that trifecta that matters most. To see organizations such as Shire, Sanofi, Danske Bank, University of Iowa Hospitals and Clinics, and even Dell itself transform their businesses with the help of Statistica is more gratifying to our team than I could ever put into words.
"We have reduced the time we spend on models up to 50 percent with Dell statistica. Our Development process is much leaner and smoother compared to what it was before," said Jens Christian Ipsen, first vice president, Danske Bank.
Customer successes like those mean more to us than any award or third-party validation out there.
But that doesn’t mean we don’t take third-party validation seriously. And we’re absolutely thrilled to share Dell Statistica’s ranking in the seventh annual “Wisdom of Crowds Business Intelligence Market Study,” published recently by Dresner Advisory Services.
A well-respected source for industry insight, Dresner’s annual flagship study has been expanded to encompass BI and analytics user trends, attitudes and plans for the next three years. More than 1,500 survey respondents weighed in on major business drivers, objectives and overall industry performance. They also ranked 28 individual solutions from a field of legacy vendors, analytics frontrunners and Silicon Valley start-ups.
Dell Statistica was rated an overall leader in customer experience and vendor credibility while scoring substantially above the overall sample across 33 criteria. We also garnered best-in-class ratings for technical support product knowledge and consulting product knowledge. Perhaps most rewarding was our perfect "recommend" score from study participants.
Interestingly enough, Dell’s strong showing correlated with a reported organizational shift toward operations driving analytics efforts, outpacing executive management for the first time. While “making better decisions” retained its hold as the top objective, “improving operational efficiencies” was a close second. This tells us BI and analytics are moving from the exclusive domain of the C-suite into lines of business where citizen data scientists are embracing analytics to solve tough business problems.
The ability to empower traditional and citizen data scientists is a key differentiator that likely helped Dell earn a leadership spot for customer experience. After all, we’ve spent three decades evolving our robust predictive analytics solution, while continually enhancing ease of use. Dell Statistica version 13.1 features new functionality that makes it even easier for technical and non-technical users to simplify preparation of structured and unstructured data.
Reusable Process Templates now empower all types of users to share and distribute analytic workflows. Our ongoing commitment to creating a foundation of openness and flexibility continually elevates the Statistica customer experience and strengthens Dell's credibility as a trusted, innovative analytics partner. So does our support for heterogeneous environments, as customers want to run any analytics on any data, anywhere, to drive better decisions across their organizations.
In contrast, many legacy providers in Dresner's study have been slow to adopt open source technologies and self-service analytics capabilities. Some well-known names ranked below the overall study sample. This echoes the growing dissatisfaction customers often voice about status quo BI and analytics platforms failing to keep pace with rapidly changing business demands.
As the BI and analytics market continues to heat up, organizations around the world will seek partners that make the right investments—both in advancing their own platforms and demonstrating strong understanding of what customers are trying to achieve. To accomplish this, you need to fuse outstanding technical capabilities and strong business acumen.
Over the years, Dell Statistica has built a strong base of business knowledge by supporting top pharmaceutical, manufacturing, healthcare, financial services and retail organizations worldwide. In this video, you can hear from Tim Alosi about pharmaceutical manufacturer Sanofi’s business-led strategy that uses Statistica to reduce the cost of service delivery by harmonizing data technology with business practices.
Deep insights and analytics have helped Sanofi and other customers improve quality control, increase market intelligence, reduce supply chain risk, elevate customer experiences…the list goes on and on.
That’s why it was no surprise when Gartner predicted that by 2018, more than half of large organizations globally will compete using advanced analytics and proprietary algorithms. Our customers already are widening their competitive edge with advanced capabilities and unique functionality, such as new edge scoring to address nearly any IoT analytics use case.
Much of what makes Dell Statistica stand out in a sea of solutions is this unwavering focus on making analytics more accessible to more people. The opportunity to democratize analytics remains a guiding force and key competitive differentiator in our business approach, product development and customer partnerships.
What’s your plan for pushing ahead of the pack this year? Connect with me at david.sweenor@software.dell.com to share your thoughts.
Predictive, advanced, statistical: these analytics terms often seem interchangeable. Do they mean the same thing? How can they be distinguished from one another, and does it really matter? Or, more precisely, when does it matter?
Perhaps advanced analytics simply represents the next evolutionary step in insight, taking analysis beyond the basic and requisite reporting that BI and visualization have delivered for the past couple of decades. Perhaps the best way to assess the value of these terms or designations is to consider them from various viewpoints.
For the data scientist, these terms carry mathematical nuances that the non-data scientist may not fully appreciate. For the business persona, the importance lies in communicating properly with data scientists: we need to know how to describe the problem or question we want tackled so that our business intent is preserved in translation.
On the other hand, when data scientists share their findings, a business user who appreciates the mathematical and methodological nuances behind these terms will better understand the results being presented.
During the tweet-up "#ThinkChat - Advanced Analytics - What's in your future?", data scientists and VPs alike weighed in on what these terms mean to them. A couple of highlights seem worth sharing:
Data scientist @angela_W said, "Advanced Analytics is analytics on a consistent basis to manage the company off reality instead of someone's gut feelings. #ThinkChat"
VP of Market and Strategy @ShawnRogers said, "Advanced analytics is the next level of sophistication beyond BI. taking us to predictive and prescribed insights. #ThinkChat"
Let me know what these terms mean to you and whether or not you find the distinction worthy of articulating.
Looking for more from @JoSchloss? Take a look at the virtual panel discussion – "Discussion with a Data Scientist"
About Joanna Schloss
Joanna Schloss is a subject matter expert in the Dell Center of Excellence specializing in data and information management. Her areas of expertise include big data analytics, business intelligence, business analytics, and data warehousing.
The onslaught of big data is well-documented, and the consensus is that the technology that has enabled its rapid proliferation has far outpaced the ability of human study (and matriculation rates) to keep up with the resulting need for analysis.
Big data technology has impressed many businesses, stimulating their appetites for building the future now and allowing them, for the first time, to seriously consider turning former data-driven pipe dreams into attainable business goals. So the excitement has grown right along with the technology's promises, and cross-industry demand for data scientists has risen dramatically.
As highly trained and dedicated professionals specializing in the science behind data, data scientists (a.k.a. "quants" or "data analysts") were snatched up as the most obvious choices to satisfy these new and expanding data needs. Of course, the law of supply and demand kicked in, so these data scientists became very expensive to hire and retain. Now the perfect, fully rounded, platform-agnostic data scientist (if the perfect one ever existed at all) is nowhere to be found in the open market. Searching for one is like looking for a unicorn: it would be magical to find one, but good luck with that.
Question: What is a self-respecting business to do when faced with the rising costs of resources that are deemed necessary to take revenue to new heights?
Answer: Find a way to achieve the same results through alternative, possibly less expensive means, without waiting for the next crop of data scientists to graduate from college, even as more and more colleges and universities respond to the skills gap by establishing data science degree tracks.
Everyone was looking for unicorns. What they found was better.
Not so long ago, in editorials written not so far away, they were called “accidental analysts,” those who would use and manipulate data without benefit of substantial, formal statistical and algorithmic training. Today the popular name is “citizen data scientists.” Whatever you call them, the rise of such data handlers is not by accident, nor is it by design. It is simply by necessity.
Think of it like your cable or satellite bill—why would you want to pay for a pre-arranged subscription package that contains much more than you need? The lack of unicorns has brought businesses to the realization that maybe they don’t need one or two full-blown data scientists who can do EVERYTHING. Instead, they can assemble citizen data scientists who can do what is NECESSARY.
In a recent article, Shawn Rogers, chief research officer of Dell Statistica, rightly observes the economics of the situation: “Not every company can afford a data scientist, which is a big reason why citizen data scientists will become a big part of the data ecosystem as it evolves.”
In the same article, Innovation Enterprise’s Laura Denham states pragmatically, “Everyone in the organization needs to be able to leverage the data to some degree, and it cannot simply be left to one highly trained individual sitting at the top of the firm dishing out insights.”
"Why not?" you may ask. Because insights would take too long! Anymore, data analytics technology has increased user expectations for rapid turnarounds in data processing as well as in decision-making. The ever-shortening patience of customers and employees alike simply won’t tolerate the turnaround times associated with traditional, centralized decision-making models. (Unless, of course, there is an assurance of more accurate results, as noted in this video chat between Rogers and Dell’s Joanna Schloss.)
But who would be the data processors and decision makers in such a decentralized scenario? Generally, that would be the line-of-business (LOB) users who are working with data collected at critical process points. If only these people could engage effectively with the data—especially in real time—then they could provide quick decisions that improve quality, maintain efficiency and impress customers. These are the citizen data scientists, and there are probably many such potential analytics users in your own organization.
How is it even possible to custom-train your own substitute unicorns?
It is possible because the main ingredients for successful data analysis are curiosity and creativity, not technical expertise. For instance, Robert Murphy, managing partner at Movéo, suggests in his Webbiquity guest blog that the top six characteristics for success in a data-driven marketing world are creativity, curiosity, communication skills, a knack for strategy, a desire for continuous learning and statistical/technical expertise. Notably, he lists technical expertise last.
Why is this? As author Simon Sinek says, "You don't hire for skills, you hire for attitude. You can always teach skills." Analysts come from all walks and disciplines; the technology is teachable, so anyone with the right aptitude and attitude is trainable. Simply put, you can't teach curiosity or creativity, but you can teach enough technical expertise for your people to become effective at data analysis and model building, specialized for your business needs and working in concert with other citizen data scientists.
To support this burgeoning market, Dell has been retooling its Statistica predictive analytics platform to make it easier for citizen data scientists to perform their duties. In the latest 13.1 release (to be generally available in mid-June 2016), citizen data scientists will find it easier than ever to build and reuse workflows, configure in-database processing with three simple steps, compare and deploy advanced models, conduct visual analyses and drill-downs, and find patterns through new network analytics. With Statistica 13.1, Dell encourages its customers’ citizen data scientists to apply data science to the most important questions in their organizations, improving the speed and relevance of data science projects.
To see some demonstrations of how Statistica 13.1 will be a boon for citizen data scientists, register now for our free June 21 webcast, "A Day in the Life of a Citizen Data Scientist."
By 2020, some analysts predict, there will be roughly 30 billion connected IoT devices, including refrigerators, racecars, robots and many other interesting items. From connected thermostats, coffee pots, medical devices, lighting, audio speakers, TVs, factories and buildings to remote controls, nearly everything will have an IP address capable of transmitting data for analysis.
So, if one major retailer already generates 2.5 petabytes of transaction data in an hour (more than 167 times the amount of data stored in the Library of Congress), what will become of the data deluge? Should we simply wait for all of the yottabytes (a trillion terabytes each) to show up in Bumblehive? Or maybe wait for data science unicorns and Ph.D. statisticians to analyze and make sense of the ever-growing mountain of data?
Of course not. Those who wait too long may not be in business tomorrow.
Emerging trends
These four key trends are disrupting entire industries and challenging the status quo:
To address these challenges, Sanofi's Tim Alosi noted that one of the big benefits of analytics is the "compression of time,"[1] allowing organizations to act on their data sooner.
How Statistica 13.1 helps your organization
The latest version of Statistica provides your organization with several new features so you can more easily adapt to these trends. We’ll talk a bit more specifically about Statistica 13.1 in subsequent blogs. But, the key capabilities and enhancements of this newest version include:
Interested in learning more? Register now for our May 12 webcast about Statistica 13.1 to gain a deeper understanding of how this release will benefit your organization.
[1] "IoT Analytics And Dell Statistica Deliver Time Compression." Gil Press, Forbes. April 15, 2016.
About David Sweenor
From requirements to coding, reporting to analytics, I enjoy leading change and challenging the status quo. I have over 15 years of experience spanning the analytics spectrum, including semiconductor yield characterization, enterprise data warehousing, reporting/analytics, IT program management, product marketing and competitive intelligence. I currently lead analytics product marketing for the Dell Software Group.
By Danny W. Stout, Ph.D.
Earlier this month, Dell Statistica was honored to be recognized by Gartner in the "Leaders" quadrant of its Magic Quadrant for Advanced Analytics Platforms (February 2016), moving up from last year's "Challenger" position. The report identifies this as the fastest-growing segment of the analytics market and expects that by 2020, predictive and prescriptive analytics will attract 40 percent of enterprises' net new investment in business intelligence and analytics.
Today, advanced analytics platforms are an essential tool for business analysts, statisticians and data scientists. Gartner bases its rankings on "completeness of vision" and "ability to execute," focusing on companies with a strong track record in the market that are most likely to influence the market's growth and direction. In this vein, Dell Statistica has a 30-year track record in the industry, enabling organizations to realize the full potential of their data to predict future trends, optimize business and manufacturing processes, identify new customers and sales opportunities, explore "what-if" scenarios, and reduce the occurrence of fraud and other business risks.
The Statistica platform is an easy-to-use solution that requires no coding, integrates seamlessly with most database platforms or directly with Hadoop, and helps organizations simplify how they deploy predictive models directly to data sources at the edge, inside the firewall, in the cloud and in partner ecosystems. Customers like the fact that while no coding is needed, anything can be coded within Statistica using Visual Basic, an industry-standard language. This lets users customize functionality for their business, giving them the best of both worlds: a platform where coding is not necessary, but available if needed.
Over the past year, Dell has delivered innovative new functionality across the Statistica product line, giving customers feature updates that improve sales and marketing strategies. Gartner acknowledged that hard work and praised in particular the platform's new strategic focus on the Internet of Things (IoT). You can read more of the report here.
Discussing how data analytics can enable healthcare professionals to provide a higher level of care, Joanna offers an overview in Building Better Healthcare, where she outlines how professionals can apply this tool to drive innovation without losing value.
Sad to see 2015 trends slip into the past? What's new in your world? Mix together some Statistica users, talking heads and subject matter experts, combine with a healthy dose of the analytics community on Twitter, and it turns out that everyone’s got something to share! Enjoy this recap of our monthly #TweetChat discussion.
Using the 2015 Forrester Wave report as a starting point to address the widespread applicability of advanced analytics, author Cassandra Ballentine taps our own Shawn Rogers (among others) for some additional insights. Rogers emphasizes that the application of advanced analytics is best driven by business needs rather than technological capabilities. Company size and data volume are not nearly as important as understanding data so that it can be used to improve the way you do business.
John reflects on the journey Statistica has taken from its inception with StatSoft in 1984 through its accomplishments with Dell over the past two years. With its latest industry recognition as a market leader, he anticipates the real journey is just beginning and explains why Statistica is up to the task.
Touching on key analytics lessons learned from Super Bowl 50, Shawn discusses the importance of getting a jump on the competition, improving current processes and changing organizational culture. In this CMSWire article, he outlines five lessons about winning with analytics that businesses can learn from how Denver went about winning this year’s big game.
When Dell acquired StatSoft and its Statistica advanced analytics platform back in early 2014, we knew it was only the start of a journey toward something bigger. Certainly, StatSoft had every reason to be proud of its accomplishments up to that point. The company and its leaders spent 30 tireless years building a loyal user base and earning a reputation for world-class customer service and support that exceeded anything I'd seen in all my years in the industry. But we still felt there was room for more: more investment, more functionality and more innovation. In other words, we felt there was room for true market leadership.
That’s why I was so proud last week when Dell was positioned as a “Leader” in the 2016 Gartner Magic Quadrant for Advanced Analytics Platforms. For those of you not familiar with Gartner Magic Quadrants, they assess vendors on their ability to execute and the completeness of their vision, and provide the industry with an important and objective tool that organizations can use to help evaluate the advanced analytics capabilities of vendors worldwide. I highly recommend reading not only this Magic Quadrant, but others with relevance to key areas of your business.
On a personal level, I can’t help but feel as though this placement represents an inflection point of sorts for Statistica – an opportunity to not only look back at all we’ve accomplished as a business, but also to look forward to the many innovations we still have in store for customers.
By committing to and never wavering from an aggressive development roadmap, in less than two years we've transformed Dell Statistica from a solution that meets the needs of top Ph.D.s and data scientists into one that also meets the needs of the everyday citizen data scientists who have become the true driving force behind the use of analytics at so many companies. We did this not only by greatly enriching Statistica's data visualization, visual discovery and dashboarding capabilities, but also by delivering a completely revamped and modernized GUI that prioritizes ease of use and visual appeal.
As proud as we are of those enhancements, technology leadership is about more than just having all the requisite bells and whistles customers want today. It's about charting a course that prepares them to deal with the challenges of tomorrow. And there's no greater analytics challenge staring customers in the face than the explosive growth of IoT infrastructures and the edge devices of which they're composed. That's why we moved aggressively to build out our new Native Distributed Analytics Architecture (NDAA) capability, which enables Statistica users to push predictive algorithms and scoring functionality directly to the source of data. This eliminates the time and expense required to transport data to a centralized location, and it allows immediate action to be taken in response to insights. Leadership means coming up with a new approach to address a new challenge, and with NDAA, we've developed today a capability that no organization will be able to live without tomorrow.
I noted earlier that when we initially acquired StatSoft, we felt like we were at the start of a journey to something bigger. With all we've accomplished in the two years since, it might seem as though that journey is now complete. But in fact, the opposite is true. We're still only getting started. We just completed year one of an ambitious three-year plan to deliver continued innovation in advanced analytics. In the months and years ahead, we'll continue delivering aggressive cycles of product updates to our customers, allowing them to progress and scale at their own speed while maintaining a proactive approach to tackling the challenges of tomorrow.
At Dell, we don’t take leadership lightly. So, while we’re extremely proud to be recognized as a leader today, you don’t need predictive analytics to know how determined we’ll be to maintain that leadership well into tomorrow.
Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner's research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.
Source: Gartner, “Magic Quadrant for Advanced Analytics Platforms,” February 2016
For Dell Statistica, a great year just got even better. Earlier this morning, Dell was recognized as a technology leader in Advanced and Predictive Analytics for 2015 in the inaugural Dresner Advisory Services Technology Innovation Awards, a new awards program that recognizes the top vendors across the company's Wisdom of Crowds® series of research covering numerous thematic areas.
If you’re not familiar with Dresner Advisory Services and its chief research officer, Howard Dresner, well, you should be, especially if you’re serious about keeping up with all things business intelligence and analytics. Dresner Advisory Services is one of the premier providers of truly independent, third-party research and analysis. Its findings and commentaries are not driven by research sponsors, but by data and input collected from real-world technology users.
So, it goes without saying (but I’m going to say it anyway) that we’re honored to have Dell recognized as a Technology Leader for 2015 in the area of advanced and predictive analytics by such a trusted and respected authority. The expert label gets thrown around a lot these days, but in the case of Howard Dresner, it’s absolutely warranted, and we’re thrilled to have earned the recognition of his firm.
Now, Statistica has been on the receiving end of more than its fair share of awards and recognition over the years, but if it seems like we’re more excited than usual about this bit of recognition, there’s good reason. As 2015 draws to a close, it’s a great time to reflect back on all that we’ve accomplished since welcoming Statistica into the Dell family. And in doing so, I can’t help but feel as though this recognition today serves as a validation of sorts for all the hard work we’ve completed, and as motivation for all the hard work still to come.
At the time of its acquisition by Dell in the spring of 2014, StatSoft was a company with a 30-year track record of success and a loyal user base for its Statistica advanced analytics software numbering in the millions. But we nonetheless had our work cut out for us, as the advanced analytics market and the needs of customers were rapidly evolving, and continue to do so. One of the primary items on our immediate technology to-do list for Statistica was to deliver enriched data visualization, visual discovery and dashboarding capabilities. We did just that earlier this year with the introduction of the Statistica Interactive Visualization and Dashboard Engine. We also heard loud and clear that our customers wanted more visual appeal and even greater ease of use, and we responded by introducing a completely revamped and modernized GUI this year at Dell World as part of the launch of Statistica 13.
But you don’t get to be a technology leader without operating on the leading edge, and we’re doing just that with our focus on what we call Native Distributed Analytics Architecture (NDAA), also introduced in Statistica 13. With NDAA, Statistica users can push predictive algorithms and scoring functionality directly to the source of data, allowing companies to take advantage of the compute power on that system while eliminating the time and expense required to transport data to a central repository. In other words, instead of pushing data to the analytics, we’re enabling customers to push analytics to the data. This concept of “analytics at the edge” is already achieving great traction with customers, and considering the explosive growth of IoT environments happening as we speak, we fully expect NDAA will soon become a must-have capability. And we fully expect Dell Statistica to lead the way in delivering it.
Though we’ve come a long way in a short period of time, we’re really just getting started. In 2016, not only will we continue to enhance and enrich our first-to-market NDAA capabilities, but we’ll continue our emphasis on delivering vertical-specific packages designed for the specific needs of companies in industries such as healthcare, pharmaceuticals, manufacturing, and financials. And in keeping with Dell’s heritage, and with StatSoft’s, we will continue to focus on democratizing and making advanced analytics available to the masses. We’re already seeing a new breed of non-technical analytic users cropping up throughout organizations. These citizen data scientists will play an enormous role in the continued growth of advanced analytics, and we’re committed to helping them drive innovation for their companies.
In other words, as great as 2015 was, and as happy as we are to have ended it on such a great note courtesy of the Dresner Technology Innovation Award, we're looking forward to even bigger and better things in 2016. And beyond.
Emphasizing the importance of investing in big data projects, Joanna describes in her latest CIO Review article (page 85) how organizations can foster IT/business alignment, develop in-house training programs and find tools that reduce complexity while managing and analyzing all data.
Mix together some Statistica users and subject matter experts, combine with a healthy dose of the Twittersphere's analytics community, and it turns out that everyone’s got something to share! Enjoy this recap of our monthly #ThinkChat discussion that addressed ten questions on how the Internet of Things is impacting the manufacturing industry.
Statistica is ready to handle the IoT, but do small and medium businesses recognize the potential for themselves? With an eye toward the practical application of a Dell Gateway, guest blogger McCabe offers this analysis in her recent Dell World follow-up.
Nature abhors a vacuum. So does data analytics. The whole idea behind collecting and analyzing data is to answer questions within some kind of context and thus enable decisions to be made: hopefully, better decisions than those made without the data.
So, it only stands to reason that the more relevant data you collect, the better your analysis and decision making can be, as long as you don't succumb to "analysis paralysis," of course. To this end, successful outcomes require data collection and preparation to go hand in hand with your analytics efforts.
So, how do you collect more and better data that is relevant to your context? For starters, you might expand the number of data collection points within your current monitoring systems. You could also arrange to share or purchase data from third-party sources. Or you can revisit previously untapped data archives you already have on-site. Any or all of these approaches might be reasonable for your situation, but executing them over time can produce convoluted data environments and workflows that would make Rube Goldberg proud, especially in a siloed enterprise. It is with such environments in mind that effective data integration becomes extremely important for maintaining any semblance of efficiency.
Subscribers to our Statistica newsletter (yes, you can subscribe for free) already got a tip for an effective data integration solution, because the October/November issue contains a link to a helpful and detailed how-to article highlighting Statistica's new connection with Dell Toad: "The Smart Data Analyst's Tool Set." Authors Robert Pound and Scott Burk explain that analytics follows the age-old 20/80 rule that pervades all human activity: data analysis is only 20 percent of your work, while the upstream data preparation, the aforementioned collection, integration and cleansing, consumes the other 80 percent. The key to success here is the comprehensive capability of Toad Data Point and Toad Intelligence Central (TIC), which can now be accessed easily from within the enterprise editions of Statistica 12.7 and 13.0.
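As a tiny illustration of that ratio (generic Python, nothing Toad-specific; the file and column names below are hypothetical), notice how many lines of a typical job are preparation and how few are the actual analysis:

```python
# Generic sketch of the 20/80 split: most of the work is collecting,
# integrating and cleansing; the analysis itself is one line.
import pandas as pd

# --- preparation: collect, integrate, cleanse (the "80 percent") ---
orders = pd.read_csv("orders.csv", parse_dates=["order_date"])
customers = pd.read_csv("customers.csv")
df = orders.merge(customers, on="customer_id", how="left")
df = df.drop_duplicates(subset="order_id")
df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
df = df.dropna(subset=["amount", "region"])

# --- analysis: the "20 percent" ---
print(df.groupby("region")["amount"].describe())
```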
The last thing you want is for your analytics projects to be overwhelmed or undermined by the complexity of so many disparate data sources. Of course, you need those sources in order to escape the pull of the data vacuum and make the most of your decision-making context, so you probably shouldn't ignore them. Thankfully, Statistica and Toad combine to make the process simple and relevant.
Read the Oct/Nov Statistica Newsletter >
About Paul Hiller
Paul Hiller is a marketing communications analyst at Dell Software. He enjoys bringing order out of chaos.
Probably every IT department has at least one cynic who believes that every software maker touts every new release as something earth-shattering. After all, why give software a new number if it doesn’t represent a quantum leap of some kind, right? However, it is arguably true that some releases may disappoint the masses while others may justify their sequential numerations. So, skepticism may be a healthy way of self-regulating one’s expectations.
How does this apply to Statistica 13?
Having said all that, you probably expect that I will now claim the new Statistica 13 really is earth-shattering (it is!) and that you should simply take my word for it (you should!). There actually are specific capabilities within this release that make touting its merits a very easy assignment. However, "earth-shattering" remains a subjective term, so I should not be so crass as to insist you take my word for anything.
Instead, I will gladly let others make that case for me, because this Statistica release is very impressive and people are taking notice. Our newsletter subscribers already received a headful of headlines about Statistica 13, big data, and the Internet of Things (IoT) generated from our recent Dell World event in Austin, TX. Maybe you've run across these headlines yourself in other venues:
On top of Dell's own press release, these six articles are but a drop in the bucket of media coverage. But I can tell you that what got journalists and analysts really excited about the Statistica 13 rollout is our software's application of Native Distributed Analytics (NDA), which saves time and effort by pushing algorithms and scoring functionality into your databases, basically analyzing your data, even big data, right where it lives. Statistica 13 distributes analytics anywhere, on any platform. When it comes to dealing with streaming data and transfer limitations, NDA will fast become a busy analyst's best friend.
Meanwhile, you just know there are other enhancements in Statistica 13 that will make the user experience more enjoyable and productive with respect to data visualization, workspace GUIs and more. After all, we had to pack in enough newness to justify that new number 13, right?
Today is a good day to check out Statistica 13 to see what it can do for you and your business. Also, be sure to subscribe to the Statistica newsletter to keep abreast of our latest product info and thought leadership.
We've all been to college at one time or another. Some of you reading this post are still in school even now. And the majority of us are probably still paying off student loans.
Speaking of college costs, maybe you have already learned about Dell Statistica's response to students in need. Our answer: FREE academic software!
Major Costs Add Up at School
Ponder your college years for a moment. Good times and challenging courses. But let's focus on the struggle of the whole college-experience ROI. What are your top complaints in this regard? If they relate to costs, you are in broad company. A nationwide campusgrotto.com survey of higher education students reveals a list of popular complaints, with a measurable percentage stemming from costs:
Okay, we can't help you with the cafeteria food, but you'll notice the other complaints are indeed about costs.
Additionally, a plurality (39%) of respondents to Princeton Review's recent "College Hopes & Worries Survey" said their biggest concern is the level of debt incurred to pay for a degree.
It comes as no surprise that everything at college costs more money than we like, and it all adds up. Consider textbooks alone, the bane of every undergrad out there. Costs vary greatly from one major to the next, but assuming new book purchases are required, a study based at the University of Virginia indicates that a statistics major is neither the most nor the least expensive when it comes to textbooks. However, the study did find the average statistics textbook costs about $110, and students must buy multiple textbooks throughout the major's curriculum. The most expensive statistics book topped out at $342.
And, as if that weren’t enough…students in the data sciences get to tack on the cost of basic analytics software, too. It's like buying a virtual textbook on top of the physical textbooks.
What is the skills gap?
Meanwhile, though it may vary from industry to industry, the data scientist skills gap is real. As long ago as 2011, McKinsey & Company was already reporting that there would be a shortage of the talent organizations need to take advantage of big data. Barring some kind of change in the human resources supply chain, the firm predicted that by 2018 "the United States alone could face a shortage of 140,000 to 190,000 people with deep analytical skills as well as 1.5 million managers and analysts with the know-how to use the analysis of big data to make effective decisions." This is great news for students looking to break into this career path.
Change that Matters
So, our free academic program in North America is the kind of “change” we can apply readily to impact that human resources pipeline at the university level. It may not sound like much, but remember that every little bit helps when we are talking about reducing the financial burden of students seeking a strong foundation with skills-based training and key software tools in order to increase their value in the competitive data science field.
Think about it: The world needs more statistics and data science graduates to handle the deluge of big data challenges that are developing in every industry. Would the cost of just one more textbook—or, in this case, an analytics software package required by the professor—make or break the average student's ability to pursue the degree? Why risk it? We'll just give it away and let the chips fall where they may! If we choose to give away some software to help put more problem-solvers into the world’s workforce, then that's what we will do.
And the value of such a program? Priceless! Not only is the free academic bundle a boon to the study of analytics in North American academia, but because it will expand the pool of graduates qualified for real-life analytical pursuits across industries, the effects of this program are immeasurable, with potentially world-changing impact. You just never know where the next genius case study will originate. Truly, the gift that keeps on giving.
In a demonstration for CRNtv, Danny Stout shows the simplicity of using Statistica to reduce customer churn—that is, to predict and reduce the likelihood of customers leaving a company's consumer base—while cutting costs and building revenue. Danny further describes the value of Dell's end-to-end solution that comprises hardware, software, and services.
John Thompson recently described to Datanami how the internal data migration to Statistica from “a legacy software provider in North Carolina” has quickly reduced non-standard KPIs by 50 percent and synchronized 99 percent of incoming information with existing records. Impressive!
Schools and teachers have long scrambled to keep up with the progression of educational tools. But without adequate support, pushing technology into the classroom becomes a source of frustration for teachers and a disservice to students. Joanna Schloss addresses the recent report Dell published with THE Journal about the transformative effect predictive analytics and data management can have on K-12 education.
This Halloween, perhaps the creature you fear most is the 800-pound monster in your I.T. department commonly referred to as “change.” However, when it comes to migrating from one data analytics system to another, Dell is an experienced monster-killer. Dave describes how we reduced the fear of change during our own internal migration, so now the only scary thing is the thought of going back to that legacy system.
I love trick-or-treaters. As soon as dusk hits, our neighborhood gets swarms of cute little kids in adorable costumes, delighting us with their smiles as we drop a few pieces of candy into their pumpkin-shaped pails. As the night goes on, however, the kids get older, more demanding and less creative with their costumes. You get the teenagers wearing pajamas asking for an extra handful, and then (sometime around 10 p.m. or later) you get very tall "kids" in street clothes wearing Friday the 13th masks banging on your door, demanding your king-size Snickers bars.

Photo credit: Steven Depolo, licensed under CC BY 2.0
Do I tell them that they may have outgrown the trick-or-treating stage? Nah. I toss a few candy bars through the door and hope they’ll move along. Someone will eventually tell them. It just won’t be me.
Are you dressing up your analytics tool for non-analytics tasks?
In my last two posts in this platform migration series, I talked about how change can be a scary thing and how the process involved can seem just as frightening. But trust me, some of the technology challenges we faced during our own migration were also pretty disturbing.
Much like those king-size “kids,” at Dell we discovered that we had outgrown a well-known legacy analytics platform. We had also just acquired our advanced solution, Statistica – an easier-to-use analytics platform – and needed to be able to tell prospects that we use the product we are promoting. So we embarked on a great migration from a very expensive and dated analytics platform to Statistica.
During the migration, our team found that a number of analytics users at Dell were using the old product to manage and manipulate data before analyzing it. Sure, it could do the job, but it was a really expensive way to move data around. And with all of the legacy code required to push data, it was unwieldy and unmanageable.
Unlike those grown-up trick-or-treaters, we quickly got out of denial and decided to do something about it.
How using the wrong tool for the job can haunt you
Using the wrong tool for the job may not sound like a big deal, especially if it gets the job done all the same. But in our case, it wasn't just that we were using a tool with extraordinary analytics capabilities for ordinary jobs; it was also tying up licenses that didn't need to be spent on data management and data manipulation tasks. We couldn't really blame our users for doing what they were likely taught by other Dell users at the time. But it was an extremely expensive way to perform relatively common functions. And as an enterprise software company, we should know better.
So how did we overcome our fear of the unknown and commit to something that kind of scared us? We had to separate analytics from data management at both the software level and the organizational level. We moved data management and manipulation to our Toad Data Point solution, and analytics and modeling to Statistica. With each team empowered with the right tools, we now have data integration experts focused on data management and analytics experts focused on analytics. Sure, there were some awkward moments during the transition. But once you figure out what really works for you, life is simply easier, a lesson those teenage trick-or-treaters will learn soon enough.
New e-book: The Great Analytics Migration
If you’re using an expensive analytics software product dressed up in a data preparation costume or you’re facing a migration project of your own, read our e-book Statistica: The Great Analytics Migration. You’ll discover more information on how we successfully moved hundreds of users to a new platform in a matter of months, provided everyone with the right tools for the job and saved a ton in licensing fees. Follow our lead, and you can do it, too.
Each new release of Statistica builds upon our basic premise: Embedding analytics everywhere is the best route to better decision making. With Statistica 13, which we officially released in September and formally showcased to the world last week on the grand stage of Dell World, we’ve made it even easier to run any analytics on any data anywhere with new tools, deeper integrations and cutting-edge capabilities that push predictive algorithm model-building and scoring directly into the data source.
Even more powerful analytics
The latest improvements fit nicely into one of two buckets: general enhancements or new analytical capabilities. In the first bucket, the revamped GUI aligns with the most current Windows products to further our long-standing compliance with Windows standards. As a result, Statistica is more intuitive and easier to use than ever.
Next, we've strengthened integration with the Statistica Interactive Visualization and Dashboard Engine so that the entire experience of conceiving, authoring and rendering visualizations happens in Statistica. Additionally, our integrated web server handles visualization rendering and management, so analytic output can be easily distributed globally for greater collaboration and information sharing.
Customer-driven enhancements
About 90 percent of our new analytical capabilities came directly from recommendations by our world-class user community. Some, like our new Statistica stability and shelf life analysis as well as web data entry, were suggestions from pharmaceutical companies that appreciate our flexibility in addressing their specialized industry requirements.
Others, such as in-database and in-Hadoop analytics, strengthen our leadership in big data predictive analytics by bringing the math to the data. With this enhancement, data consumers can build an analytical model in Statistica, click one button and deploy it in a Hadoop cluster. This is great for organizations that store a lot of data in a big data environment and want to get at it with predictive analytics.
Something for everyone
For data scientists, we've added stepwise modeling, lasso regression and tree clustering. And a new ability to mine Chinese text broadens our international scope, as Statistica is now available in 12 languages.
Not only do we speak more languages, we've also expanded our community of analysts with in-database processing that enables them to run correlations on full-volume data. What we've done is decompose the elements of algorithm formulas and simulations so they can run directly in SQL Server, Oracle, Teradata and virtually any other OLE DB or ODBC database. What this means for our customers is that they can better leverage their big data investments while enabling people other than data scientists to run correlations. This extends the universe of users dramatically: anyone from a summer intern to a business analyst can run very sophisticated analytics without worrying about complex data management and sampling tasks.
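To make the idea concrete, here is a minimal sketch of the general technique (not Statistica's actual implementation; the database file, table and column names are hypothetical): a Pearson correlation decomposed into a handful of aggregates that the database computes itself.

```python
# Illustrative sketch only: the correlation formula is decomposed into
# simple SQL aggregates, so the raw rows never leave the database.
import sqlite3
from math import sqrt

conn = sqlite3.connect("plant_data.db")  # stand-in for any ODBC/OLE DB source

# The database does the heavy lifting and returns just six numbers.
n, sx, sy, sxx, syy, sxy = conn.execute(
    "SELECT COUNT(*), SUM(x), SUM(y), SUM(x*x), SUM(y*y), SUM(x*y) "
    "FROM sensor_readings"
).fetchone()

# Pearson's r reassembled from those aggregates.
r = (n * sxy - sx * sy) / sqrt((n * sxx - sx * sx) * (n * syy - sy * sy))
print(f"correlation: {r:.3f}")
```

Six numbers come back over the wire instead of millions of rows, which is exactly why pushing the math to the data pays off at big data scale.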
Analyzing data ― even big data ― right where it lives
Processing data where the data resides, whether that’s a database or a Hadoop cluster, is another important part in Dell’s ongoing evolution to make Statistica accessible to a wider audience. We took another step in that journey in Statistica 13 with new Native Distributed Analytics (NDA) capabilities that integrate with Dell Boomi to transport analytic models anywhere in the world.
Dell Boomi is Dell's integration platform that lets organizations connect any combination of cloud and on-premises applications without software or appliances. So, we applied this incredibly cool technology to let users run analytics directly where the data actually lives. It makes perfect sense. Take a model like a neural net and send it across the network at the size of an email attachment. This lightweight model is then run against the data in a SQL database, and results are returned quickly and efficiently.
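For a rough sense of why this works, here is an illustrative sketch of our own (it does not show Dell's actual NDA or Boomi mechanics): a trained neural net serialized into a payload small enough to travel like an email attachment, then rebuilt and run next to the data.

```python
# Hypothetical sketch: serialize a small trained model and score records
# where they live, instead of exporting the records to the model.
import pickle
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500,
                      random_state=0).fit(X, y)

payload = pickle.dumps(model)  # the entire trained model as bytes
print(f"model size: {len(payload) / 1024:.1f} KB")  # email-attachment scale

# On the remote side: rebuild the model and score locally.
# Only the scores travel back; the data never has to move.
remote_model = pickle.loads(payload)
scores = remote_model.predict_proba(X[:5])[:, 1]
print(scores)
```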
With Boomi, we can take analytics to the data in a highly secure, efficient manner. This capability, which is unique to Dell, is designed to make analytics more accessible and available. That’s also part of our goal in extending the work we started with open source R and Azure ML last year. We’re making strides in collective intelligence to open our platform further and enable people to bring in models from all over the world to solve the toughest business problems.
How will Statistica 13 simplify your work?
Statistica 13 deals customers a winning hand with improvements, enhancements and new integrations that illustrate our continuing focus on lean-forward technology. If you’d like to learn more about how Statistica can simplify your work, check out our upcoming webcast, What’s New in Statistica 13.
What new features in our “lucky 13” release will help your organization do more with your big data predictive analytics? Connect with me on Twitter at @johnkthompson60 to share your thoughts.
Technology is always knocking on school doors.
Schools and teachers have long scrambled to keep up with the progression of tools for educating students. The knowledge gap is a given, when you consider that students outnumber teachers dozens-to-one and adopt new technologies quickly.
Students are taking tests on tablets, studying (and collaborating) with peers online and answering questions in real time on documents in the cloud. Teachers are more accustomed to working with pen and paper, but the facts on the ground are moving them to mobile devices just so they can keep up with students.
Technology is good. Support is better.
Students learn new technologies at recess almost daily. How do teachers make time for their own technology training? You can push technology into the classroom, but until there is adequate support, it becomes a source of frustration for the teachers and a disservice to the students.
Teachers are aware of the gap between the tools and technology on one hand and the support they need to use them effectively on the other. To create the infographic Powering Student Learning with Data Analytics in K-12, THE Journal surveyed decision makers on the use of data in K-12 education and quantified that gap:
Predictive analytics in K-12 education
Together with THE Journal, we’ve published a report called Game Changer: How Predictive Analytics is Transforming K-12 Education. Read it for more insights into using data analytics and data management in education.
How do you think we can bridge the technology gap? Let me know in the comments below.
In this post, I share highlights of a fascinating conversation with Dr. Steven M. Melemis, a long-time user of Dell Statistica and recognized expert in the field of addiction medicine. After earning a Ph.D. in statistics, Melemis went back to school for a medical degree and has merged his two academic pursuits to improve patient care and streamline emergency room operations.
Q: When did you first recognize the data/healthcare connection?
Given my background in statistics, I saw a long time ago that data drives almost all of our decisions. So if we can better analyze data, we can provide better patient care. The challenge in healthcare, however, is that many physicians suffer from a fear of statistics, especially now when the massive volumes of data can be intimidating.
Q: So, what’s the best way to help physicians conquer that fear?
While I enjoy analyzing complex data sets, I realize not everyone wants to dig that deeply. Still, you can learn a lot from simple data visualization. I tell people to look at the charts—they’ll tell you what you need to know. Statistica’s quick-start data mining recipes, analytic workflow templates and out-of-the-box analysis capabilities make it much easier to gain meaningful insights.
Q: When did you first start using predictive analytics software?
In my early academic career, I used statistical software that ran on mainframes. During my post-doctoral fellowship in the 1990s, however, I wanted to find something I could use on a PC. That’s when I first found Statistica—and I’ve been using it ever since.
What I like best about Statistica is it works across the entire spectrum of predictive analytics. While it features sophisticated tools, it’s also suitable for people who just want to draw a few graphs and really look at their data in different ways. When data is more accessible, you can put capabilities in the hands of the people doing the actual work and empower them to see things they didn’t see before.
Q: How has predictive analytics transformed alcohol withdrawal treatment?
It has dramatically improved how patients are monitored in emergency rooms and rehab settings. For years, a classic tool called the Clinical Institute Withdrawal Assessment (CIWA) was used to assess whether patients needed treatment for alcohol withdrawal. The labor-intensive CIWA required a nurse to monitor a patient by measuring 10 different variables every hour. The problem: some people didn't get the entire test because there simply wasn't enough time or resources, and occasionally, patients fell through the cracks.
I knew there was valuable insight buried in the data, so I digitized a small core of CIWA measurements for 100 patients and fed it through Statistica, and one variable stood out! Whether a person needs withdrawal treatment can be predicted accurately 75 percent of the time simply by determining whether the person is sweating. Add one more variable, perceptual disturbances, and the predictive accuracy jumps to 90 percent.
I took that information to the hospitals in Toronto, where I live and work, and also shared it with leading rehab programs. The bottom line: Treating patients for alcohol withdrawal has become faster and more efficient. All from a simple yet powerful data analysis performed by Statistica.
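To give a rough feel for the kind of analysis Dr. Melemis describes, here is a minimal sketch on synthetic data (the relationships and numbers below are invented for illustration, not the actual CIWA dataset) showing how the predictive power of one or two binary symptoms can be measured:

```python
# Synthetic illustration: how much accuracy one or two binary symptoms
# can carry. All values are made up for demonstration, not clinical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 1000
needs_treatment = rng.random(n) < 0.4  # invented base rate

# Symptoms correlate with the outcome, plus noise.
sweating = np.where(needs_treatment, rng.random(n) < 0.85, rng.random(n) < 0.20)
perceptual = np.where(needs_treatment, rng.random(n) < 0.70, rng.random(n) < 0.05)

for label, X in [("sweating only", np.c_[sweating]),
                 ("sweating + perceptual disturbances", np.c_[sweating, perceptual])]:
    acc = cross_val_score(LogisticRegression(), X, needs_treatment, cv=5).mean()
    print(f"{label}: {acc:.0%} cross-validated accuracy")
```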
Q: How has predictive analytics changed your approach to addiction recovery?
My approach is best summed up in an article I wrote recently for the Yale Journal of Biology and Medicine, where I outline relapse prevention and the Five Rules of Recovery. Predictive analytics has shown me that people can greatly improve their chances of recovery if they follow a few simple guidelines.
By empowering physicians of all kinds to take advantage of valuable data analytics, we can make major strides in personalizing healthcare and treatment plans while lowering healthcare costs and elevating patient care delivery.
And, the best part is that you don't need to have a Ph.D. in statistics. All you need is an easy-to-use tool like Statistica to simplify big data analysis and improve critical decision making.
Subscribers to Statistica's monthly newsletter saw a headline last week mentioning Grammy-winner John Mayer alongside a reference to Statistica 13. Okay, I have to admit, when Dell first announced that Mayer would be at Dell World, I honestly had no idea who he was or why I should care. It turns out that he is a famous guitarist, if you believe everything you read on the Internet (!), and he will be performing at Dell World, the company's annual, global product solutions showcase.
I don't say any of this to belittle Mayer at all, but merely to highlight for you my own ignorance. Just because I had never heard of Mayer doesn't mean he isn't great at what he does. In the same way, just because organizations around the planet maybe haven't yet heard of every Dell Software solution doesn't mean that we aren't the best at what we do. What it does mean is that, like Mayer, we must keep spreading the word, doing our best to get our name and products in front of wider and wider audiences, attracting the attention of competitors and champions alike.
Mayer has done well for himself in this regard, and so has Dell Software. In addition to bringing John Mayer to the Dell World stage in October, Dell Software will officially roll out Statistica 13 at that same event, and we couldn't be more excited! Our Statistica subscribers already read some product hints in the September newsletter (you, too, can subscribe for FREE), but we are keeping the details fairly under wraps until October 20. You may rest assured that, true to Statistica's long legacy, we are working hard to expand our worldwide audience, driving our advanced analytics platform with three defining concepts:
Will Statistica ever win a Grammy? Maybe not, but Statistica does continue pulling in industry recognition from some very credible sources (for instance, here and here), and we do expect this trend to continue as more and more businesses make the switch to Statistica. It’s like the Grammys, but different.
Meanwhile, have you registered yet for Dell World or for the Dell World Software User Forum? We want you to be there LIVE for the big Statistica reveal. And maybe you can meet John Mayer, too. I hear he's pretty good.
It’s almost Halloween: the one night of the year when everyone becomes something different. As we approach an evening of haunted houses and creepy masks, I can’t help analyzing what it is that really gives us that little shiver of fear. And I think the answer is change. Because when you break it down, Halloween is really an entire holiday built around sudden transformation and the horror that can ensue.
Take, for example, your neighbors. You’re used to seeing that khaki-clad soccer dad next door waving hello as he starts his minivan each morning. But come October 31st, when he swings open his front door to greet the now-horrified trick-or-treaters staring up at his Ozzy Osbourne makeup, a fake bat swinging from his mouth, there are bound to be a few screams.
But it’s possible that if you’d always known that guy, not as a Ned Flanders clone, but simply as the wacky old rock star next door, you wouldn’t even flinch at his proclivity for joker makeup and bat snacks. It’s the epic transformation that really throws you off. Or maybe it’s just seeing a grown man pretend to eat a bat that does that. In any case, change is a frightening thing, and it can send people running for the hills.
Confronting the 800-pound monster
So, I have to admit, when I heard Dell wanted to take on a migration project to move hundreds of our employees onto a new advanced analytics platform in just six months, I couldn’t help worrying that our office might turn into its own house of horrors.
Would the normally reserved analyst in the corner cube leap out screaming that he didn’t want to lose all the functionality he’d spent years building into our legacy system? Would angry mobs take to the halls in protest?
We needed a way not only to calm everyone’s nerves but to get everyone genuinely excited about something new and different. So how did we do it, and why?
Let’s start with the last part of that question. When you’re using an 800-pound monster of an analytics solution, you start to wonder if there isn’t a better way. From scary-high licensing costs to the need for so many of our analysts to transform into coders, the prospect of maintaining the status quo began to seem more frightening than overhauling the entire system.
Meanwhile, we’d just added an advanced analytics product called Statistica to our solutions portfolio. And as we became more and more familiar with its predictive analytics capabilities, we were blown away with what it could do.
Suddenly, we didn’t just want to sell this thing; we wanted to use it ourselves. The fact was, Statistica was simply a more affordable and robust analytics platform than the one that was bleeding us dry.
And while those of us who’d seen it in action were sold, we still had to sell it to our employees. So we got together and made a plan that would help us properly anticipate and alleviate everyone’s fear of change.
Making change less frightening
We first had to prove that all the work users had done with the legacy product wouldn’t be discarded. So we sat down and showed them how everything could be replicated and enhanced in Statistica ― while eliminating the need for coding. Once they saw how they’d be able to leverage the same functions and make them better while doing less work? Well, let’s just say we had everyone’s attention.
After we’d communicated the change and addressed everyone’s biggest fears, we gave people early access to Statistica. They began to see how this new tool better aligned with their tasks, and they started to shift their emotional connection away from the legacy platform.
To keep things exciting and build on our momentum, we launched a contest. This inspired teams to get up and running quickly with Statistica, and it was fun to watch everyone build real-world models with their new toolset. It didn’t take long before users were embracing Statistica.
When all was said and done, we’d successfully migrated hundreds of users onto Statistica in just a matter of months. I guess that says a lot about how cool the product is. But it also says a lot about how people can embrace change quickly, if it’s really for the best.
And seeing how happy our users are today, I know it was the right thing. I’m not saying I wasn’t scared at first myself. This was a daunting project. But with a little creativity, we took something that could’ve been a nightmare and turned it into something to celebrate.
Now that I’ve seen just how cool change can be, I’m looking forward to this Halloween. Because the scariest thing I can think of isn’t living next door to a Marilyn Manson impersonator for the night – it’s the thought of going back to our legacy system. Now that I don’t have to worry about that, I’m ready for some fun.
In our new e-book, Statistica: The Great Analytics Migration, we explain how we created a plan for managing the people, processes and technologies involved in making our analytics platform switch. To learn more about how we migrated hundreds of users to a powerful analytics platform that reduces costs and coding needs, check out our e-book today.
We hope that our experience will make yours a lot less spooky.
If your company’s data analytics function went to the doctor for a checkup, would it come out with one of these diagnoses?
A severe case of hyper-expectations
Data scientist deprivation syndrome
Technology deficit disorder
The advanced analytics function is barely the age of a toddler in most organizations, yet the stress is already beginning to show. You can’t fault it, really; analytics itself is changing almost as fast as the data is arriving from your customers, transactions, connected devices, industrial machinery and supply chain.
It’s like PCs in the early 1980s: We knew we needed and wanted them, but it took a while for us to figure out how to make the best use of them. Our needs changed as fast as the technology changed.
In our new e-book, Break Down the Barriers to Better Analytics, we look at how the changing face of analytics is moving faster than organizations themselves, leading to the three symptoms above.
Have a look at our new e-book, Break Down the Barriers to Better Analytics, for more insights into the changing face of analytics; data preparation and data blending; and the corporate- and IT-centered barriers to using analytics efficiently in your organization.
And be sure to read it before your company’s advanced analytics function goes to the doctor for a checkup.
Photo credit: www.RGBStock.com, Allesandro
In its long legacy, Dell Statistica has ranked very favorably for user satisfaction all over the world, as evidenced by survey results (e.g., Rexer) and customer testimonials.
One way to keep customers happy with Statistica, of course, is to provide great support! We like to share our knowledge and make it easy for customers to engage not only with our internal subject matter experts but also with each other. In fact, our newsletter subscribers were recently provided with a comprehensive list of support tools just waiting for use. If you haven't seen that list before today, you might just want to subscribe for free to our newsletter.
Keeping all this in mind, what do you suppose the following subjects have in common?
…and my personal favorite:
If you guessed these are all topics about which Statistica users have asked questions in our new User Discussion Forum, then you guessed right. Our support team monitors the forum to provide answers and feedback, and you can also engage with other registered users who have opted to receive email notifications of forum activity.
Please note that the only requirement to participate in the forum is to register first in the free TechCenter community. This is a different ID/password combination from the one used for standard Dell Software Group website access. The community registration page is accessed via the “Join” button in the top right corner of the community header on the forum page. By joining, you gain access not only to the Statistica forums but to the broader Dell TechCenter community as well. Welcome to Funky Town!*
Visit the Statistica User Discussion Forum >
* Yes, that's the delightful way the asker really spelled it in our forum. Apparently, Statistica brings the funky to data analysis and predictive analytics! And now, like me, you can have the catchy Lipps Inc. hit “Funkytown” running through your brain for the rest of the day.
Photo credit: Ernesto Andrade, licensed under CC BY 2.0
Imagine that you’re in the middle of an analytics migration project (as Dell was).
Hundreds of users and projects all over the world are in transition from your legacy analytics product to the new one (as ours were).
Everyone is heads-down, focused on following vast, detailed project plans with a jillion moving parts (as we were).
Suddenly, out of nowhere, an opportunity to fix Something Else swoops onto the scene (as it did onto ours).
That Something Else is kind of a mess, and this would be the ideal time to fix it, but everybody around you is urging you to focus, focus, focus on migrating projects and users. The Something Else has to do with the tools and processes on either side of the analytics function. It’s not exactly the same issue as replacing your company’s advanced analytics product, but it’s closely related.
What do you do: stay focused on your original project or devote some cycles to dealing with the Something Else?
ETL, data extraction and reporting. And the timing belt.
At Dell, we were waist-deep in migration from a well-known analytics product we had used for decades (you can probably guess which one) to Statistica, a product we had recently acquired. As I posted a few weeks ago, our migration project team discovered that a lot of people were using a Ferrari to haul dirt – that is, using a powerful analytics tool just for data manipulation – so we made some organizational and tool changes as part of the migration.
But the Something Else we discovered was that people were using dozens of tools for some of the main functions around analytics:
ETL (Extract Transform Load) Process Automation – Microsoft SQL Server Integration Services, Microsoft Visual Studio
Data Extraction – Microsoft SQL Server, the D3 JavaScript library, Adobe Site Catalyst
Reporting – Microsoft SQL Server Reporting Services, Microsoft Access
We had the opportunity to consolidate or replace these and stop the tool-creep, and it seemed as though we’d never have a better time to do it. Once everyone saw the inefficiency, the whole migration team wanted to deal with it, but it wasn’t part of the original plan.
It was like the timing belt story:
“Well, ma’am, your car has 90,000 miles, so we should replace the timing belt. And while we’re in there, we’ll have everything apart, so if there’s a problem with your water pump or the tensioner or the front cover gasket or the seal, that’s the best time to take care of it.”
It’s tough to bite that bullet and deal with the Something Else. But you know if you don’t deal with it and you have to go back in again later to fix it, you’ll kick yourself.
Actually, you won’t have to, because your boss will do the kicking for you.
The Great Analytics Migration – new e-book
So what did we do at Dell?
We went the extra mile and did the consolidation. It’s the kind of company we are: we can’t look at an inefficiency and not do something about it. Our companywide Business Intelligence Council ran a survey that found dozens of tools at work. The council identified seven of the ten most-used additional tools for migration to appropriate Dell technologies. We’ll get to the rest of them eventually.
Should you migrate users from other tools in the same project? We can’t tell you how to make that decision for your company, but we’ve put together an e-book, “Statistica: The Great Analytics Migration, Part 3: Technology,” that tells you how we made it at Dell. Read the e-book for unique insights into how we managed our migration. We know quite a bit about it.
Just don’t ask us about your timing belt.
While I certainly appreciate Boston for its history, chowder, and marathon, it is the predictive analytics scene that keeps bringing us back year after year. I know that sounds odd, but the annual Predictive Analytics World (PAW) Boston event is a natural fit for Statistica, especially with the recent development of a predictive healthcare track.
Healthcare's connection to predictive analytics arguably extends back to the ancient Greek physician, Hippocrates of Kos, who supposedly provided the instruction, “Declare the past, diagnose the present, foretell the future.” And if that isn't data-scientist-speak, I don't know what is! Hippocrates also touted the medicinal value of food, so I have no doubt he would have prescribed Boston clam chowder for its palliative effects, though I suspect he had his fill of seafood in his time (several hundred years before the birth of Christ).
Back then, of course, the healthcare system, if it could be called such, was comparatively simple, perhaps limited primarily to individual doctor-patient relationships. That simplicity is no longer the norm. During Statistica's mere 31-year legacy, our customers have driven us to develop expertise that guides healthcare organizations through the necessary components of data management and reporting, patient analytics, insurance risk reduction and regulatory compliance. You can learn about some of our healthcare successes in our datasheets, white papers, and videos.
So, when it comes to targeted events like PAW-Healthcare in Boston, we get to be all over the place. Our newsletter readers (yes, you can subscribe for FREE) already received a short list of our PAW-Healthcare exposure, where we will be sharing our expertise face-to-face with modern-day physicians and data scientists at breakfasts, meetups, and presentations. Take a look here and then be sure to register for PAW-Healthcare yourself.
We will also maintain a presence at booth #240, so we hope to see you there the week of September 28.
You can lead a horse to water, but you can’t make it drink.
Image credit: Greg Westfall | Licensed under: CC BY 2.0
If you’re going to spend months putting that water in place by migrating to a new analytics platform, you’d better build a process for onboarding users smoothly so that they drink. Otherwise, you’ll end up with a lot of people looking like the kid in the photo and reciting the caption to you.
I mentioned in my previous post the migration project we underwent here at Dell to move off one of the world’s best-known legacy analytics products and onto Statistica, an analytics platform Dell had recently acquired. How do you onboard users throughout the project when you make a fundamental switch like that?
Where does your user onboarding process live?
Who manages user onboarding in your organization? Usually, the onboarding process lives in IT, which is where it used to reside at Dell. It wasn’t perfect, but we lived with it that way for a long time, along with three burdensome restrictions:
Finite number of licenses: It’s hard to onboard new users when you have a limit on licenses for your analytics software. We had to ask IT for more licenses, and they had to tell us none were available.
License swapping: De-activating and re-activating licenses to move them between data analysts was a drag, but as keeper of the software keys, IT had to be involved.
Doling out licenses carefully: On the rare occasions when licenses were freed up, people had to lobby IT for access to them.
In fact, it took the migration project to finally break this pattern, and that’s when we moved the onboarding process out of IT.
The flexible licensing model of Statistica allowed us to change the focus of onboarding users from IT to self-service in the business units themselves. Then, we made an organizational change so that the Business Intelligence Center of Excellence in each business unit made and managed its own strategy for allocating access to Statistica.
That pushed the unwelcome variable (IT’s response time) out of the migration project and kicked off more-efficient onboarding for everyone. Internal customer satisfaction went up when users saw that getting access would be easier in the future than it had been with the legacy product.
We’ve written a new e-book called “Statistica: The Great Analytics Migration, Part 3: Technology.” Read it for an idea of how we at Dell handled the migration from one of the world’s best-known analytics products (you can probably guess which one) onto Statistica while allaying users’ concerns about migration and access.
If you embark on a migration project, whether for analytics or any other companywide function, you’ll need to think about making the user onboarding process palatable.
After all, the kid in the photo is cute for a minute or two, but you don’t want your co-workers looking at you like that for months on end.
“Why would you use a Ferrari to haul a load of dirt?”
Yeah. Why would you?
Photo Credit: Falk Lademann
You wouldn’t, of course, at least not knowingly. But a few months into the Great Analytics Migration I described last month, our migration team found analytics users at Dell who had been doing the equivalent for years. That’s when the question about using a Ferrari as a dump truck started making the rounds.
Better-fitting tools for data management and manipulation
The “Ferrari” was a well-known legacy analytics software product designed to run on mainframes back in the 1970s. (You can probably guess which one it is).
It happens that the product includes tools for data management and data manipulation, so our users became accustomed to using the Ferrari, with its high licensing fees and remarkable analytics capabilities, for “hauling loads of dirt”; that is, manipulating data before analyzing it.
In all fairness, most of the users were just doing what they’d learned from other users within Dell. And they weren’t ruining the Ferrari’s transmission or even scratching the paint. But as an enterprise software company, we have a line of products like Toad Data Point that cost less and are perfectly suited to the task of accessing and working with big data sources. And anyway, the entire migration project was about moving off the well-known legacy analytics software product and replacing it with Statistica, an easier-to-use analytics platform that we had acquired.
So using a Ferrari to haul a load of dirt was costing us licenses that didn’t need to be tied up on data management and data manipulation tasks. It’s an extremely expensive way to perform relatively common functions.
Better yet, as we separated analytics from data management at the software level, we also separated them at the organizational level. In the course of our migration project, we moved data management and manipulation to Toad Data Point, handled by data integration experts, and analytics and modeling to Statistica, handled by analytics professionals. That has put each team squarely in its wheelhouse.
Are you by any chance using a well-known legacy analytics software product to manipulate your data? If so, then some of your users are probably using a Ferrari to haul dirt.
That may be all right with you, but if it isn’t, have a look at “Statistica: The Great Analytics Migration, Part 3: Technology” to find out how we switched analytics platforms worldwide in a matter of months.