Image credit: Pat Herman
In the most recent issue of our Statistica Monthly Newsletter (yes, you can subscribe for free), our readers were made aware of a new Statistica user forum in our community pages. The new forum is intended to be a true user-to-user community, with discussion threads driven by the users, of the users, and for the users.
The good news is you don’t have to be a Statistica guru to participate! However, this forum does provide you the opportunity to share your best practices, seek feedback on vexing challenges, make product suggestions and expound on specific analytics and data topics that interest you. You can promote yourself by linking to relevant blogs and articles you have written in other forums, too, such as LinkedIn groups. This totally new forum will be regularly monitored by Dell experts and peers alike, so you can anticipate that your posts will be addressed even as we build our new community audience from scratch.
Why is this a big deal? Because the development of the Statistica platform itself is a response to your real-life use cases and the industry trends that affect you. And because you can improve your own knowledge base (and your personal brand) by collaborating with fellow Statistica users. You never know where the next big idea may come from. Here I will happily defer to greater minds than my own:
The sun never sets on the Statistica empire, because there are over 1 million Statistica users in dozens of countries around the globe, in industry and academia and government. As a Statistica user, you are never alone. So share the forum link with your peers, and we look forward to your participation.
For more information, subscribe to the Statistica newsletter >
Months of planning complete, hardware and software procured, associates prepped. The sails are set, and the stars are aligned to flip the switch on a major analytics platform migration. That’s what the buildup felt like when Dell was ready to start moving users from our legacy analytics platform to Statistica, an easier-to-use, lower-cost solution, as the company’s new analytics platform.
In a previous blog post, we covered the lessons learned from process planning. It bears repeating that the time Dell spent working through process-oriented questions positioned the organization to start moving users onto Statistica on schedule. Even so, once the actual migration is in progress, some new and perhaps surprising processes pop up.
What actually happens when a company migrates to a new analytics platform? Find out in the Dell-on-Dell case study. The e-book, “Statistica: The Great Analytics Migration,” is available for download.
During our migration, Dell realized that it was the business leads in the Centers of Excellence (CoEs) that had the best glimpse into progress, knowing how close each user or department was to migrating completely off the legacy analytics platform. The CoEs also had the insight into unexpected roles and tasks users and managers took on along the way. Let’s look at three:
Working in two platforms: Perhaps it’s not surprising that a migration of this magnitude puts added strain on the employees taking part, but there are only so many hours in a workweek. If teams are expected to add migration tasks on top of their daily workflow, plan deadlines accordingly.
Double checking, for a while: Analytics is pervasive at Dell and runs mission-critical business applications, so to ensure the integrity of results and minimize risk during the migration, Dell ran Statistica and the legacy analytics system concurrently to make sure everything was operating as expected. The process was unexpected and time consuming, but it was necessary before the legacy analytics platform could be turned off.
Trying to align individual business groups: You’ve heard the phrase “herding cats.” It’s an apt comparison for managing a migration in which various groups operate on their own timelines, working toward their own objectives. But success means getting all groups to completion by the overall deadline.
During the migration, we encountered some additional necessary processes to move the migration along. For instance, the team realized it was important to stop and correct inefficiencies, despite the reluctance to take any time away from moving forward. Managers also experimented with different motivation tactics, including contests. To find out more about the actual migration, download the Dell-on-Dell case study, “Statistica: The Great Analytics Migration, Part 2: Process,” which recounts ways to get all teams to stick to the deadline. Would you expect pressure in your organization to come from the business leads or IT?
There’s more to come from Dell on its Statistica migration. In part 3 of the e-book, we’ll cover all aspects related to technology components of the migration project — from architecture to tooling. In the meantime, read part 2 to get more insight into our migration process.
Subscribers to the Statistica Monthly Newsletter already received a heads-up about several events coming over the next few weeks, including the combined Statistica-Toad tech webcast on August 27, called, “The Smart Data Analyst’s Toolset.”
Maybe the Toad name doesn’t mean much yet to longtime Statistica Enterprise users, but it will. That is because Statistica—historically a very robust and IT-friendly analytics platform on its own merits—is now integrated with Toad Data Point and Toad Intelligence Central (TIC), fellow members of Dell Software’s information management portfolio.
What does Dell Toad do for you and why should you care? Well, that’s what the webcast will cover in detail, so let’s just summarize here by saying Toad opens your door to a whole new world of big data interconnectivity upstream of Statistica. Toad Data Point offers self-service data access, integration, and data preparation functionality with relational and non-relational sources. (And who wouldn’t be happy with more tools for handling data preparation, one of the leading sources of posterior pain among data analysts?) Meanwhile, on the server side, TIC seamlessly connects data, users, and files for well-governed collaboration.
Once again, this is Statistica’s way of “playing nice” with your existing IT assets and software components, a practical trait for which our platform has long been hailed by satisfied users worldwide. If you already have Dell Toad in place, now you can call Data Point and TIC directly from within Statistica Enterprise. If you do not yet have Toad in place, you really should be taking a look at Toad now!
Either way, the August 27 webcast will showcase the new Statistica-Toad connection that enables you to provision data sources across platforms like never before in order to produce a single system for data profiling, cleansing, analysis, modeling, deployment, and scoring.
A powerful connection, indeed.
Uttering the word “process” will likely send a shudder down a business or IT pro’s spine in anticipation of the planning, resources, timelines and deadlines that come with it. Despite that resistance, in many cases, and especially when facing a major IT migration, it’s the process that ensures all stakeholders are satisfied with the result.
Dell recently migrated from its legacy analytics platform to Statistica – and we’re hoping our experience will help other companies with their own migrations. In Part I of the Dell-on-Dell e-book about our Statistica deployment, we reviewed the steps we took to get our people – all associates who touch the analytics platform – on board at integral points of the migration.
The fear of a major migration shouldn’t stop your organization from deploying a better solution. Learn how Dell moved to a new analytics platform in the e-book, “Statistica: The Great Analytics Migration.”
In Part II of the e-book, we address several process-oriented challenges and questions. A few important process lessons learned:
The migration “process” starts before you even realize it – and no one likes surprises: Even if the executive staff or IT decision-makers share plans with all stakeholders as soon as the project is confirmed, anticipate that some savvy stakeholders already suspect a change is afoot. For Dell, our associates expected a change when Dell acquired Statistica. For other companies, a poorly performing analytics platform or a change in executive leadership could signal a migration. Either way, informing those involved sooner rather than later will limit the number of people caught off guard.
Investing time in laying the groundwork is well worth the effort: Before a single action was taken, Dell pooled every available resource from the Statistica team to truly understand the platform and IT requirements. That process alone took a month. But it was valuable time spent to plan and align expectations. Better informed stakeholders could more quickly and accurately answer other process questions:
- How long will the migration take?
- How many users really need to migrate?
- How fast can we be up and running?
Centers of Excellence are monumentally important: Creating Centers of Excellence (CoEs) sounds like a process in and of itself, doesn’t it? But it’s these groups of stakeholders, organized by business function, that help the organization through a migration. CoEs provide valuable insight into how the project can be helped by, or can help, each business function. At Dell, our CoEs identified the analytics platform users who should be part of the migration, which helped our team allocate resources appropriately.
Process-oriented planning is tough, and it’s a challenge to get all associates on board with the rigor necessary to make a migration successful. But the advantages of process planning far outweigh the time you might expect to save by rushing through a migration. In fact, with pre-planning, Dell was able to get hardware online and ready to accept users in five weeks – a task that typically takes three months!
For more insight into how Dell ticked through the process-oriented questions we faced, download the e-book, “Statistica: The Great Analytics Migration, Part II: Process.” Whether your organization is migrating 10 users or 10,000 users in a 6-week or 6-year project, our answers may help your migration process go smoothly.
Follow #ThinkChat on Twitter August 13th at 11:00 am PDT for a live conversation about how big data innovation is reshaping enterprise security!
Join Shawn Rogers (@ShawnRog), Chief Research Officer for Dell IMG; Joanna Schloss (@JoSchloss), Dell Software’s Analytics Thought Leader; and John Whittaker (@Alertsource), Executive Director, Information Management Group at Dell, for this month’s #ThinkChat as they discuss how big data innovation and security are creating opportunities to excel and new challenges for the enterprise.
Tweet with us about how big data represents innovation and how many companies are jumping on the bandwagon to be first to gain competitive advantage using new data types, insightful applications and new data frameworks while trying to adapt to new security concerns, risks and best practices.
Join the conversation!
The #ThinkChat Agenda Includes:
1. Is big data a challenger or enabler to enterprise security? Do today’s new solutions enable us to thrive?
2. Where do privacy and big data meet? Innovation can breed issues that go beyond security. What precautions are critical?
3. Are there best practices for non-relational data sources with regards to security?
4. Are big data solutions like Hadoop enterprise secure? Do they meet the needs of today’s business?
5. Do new big data frameworks create opportunity for better security analytics or create complexity?
6. IoT is exciting but it opens the door for greater security challenges. What are the top best practices?
7. Speed is at the heart of many security analytics scenarios. How does big data create value when speed is critical?
8. What are some Big Data factors that change the way we approach data protection?
9. How have your data protection needs changed over the past 2-3 years?
10. Is there a difference between securing and governing big data versus traditional data?
Where: Live on Twitter – Follow Hashtag #ThinkChat to get your questions answered and participate in the conversation!
When: August 13th, at 11:00 am PDT
The role of the data scientist has become the stuff of legend in the analytics industry these past few years. Many of my DBA friends have gained instant career advancement through self-appointed promotions, adding the job title to their LinkedIn profiles and reaping raises and general admiration from their peers. Tom Davenport and D.J. Patil wrote about this topic back in 2012 for the Harvard Business Review, and it caused everyone to take notice of this analytically driven job. The role of the data scientist has evolved over the past three years, and so has the definition.
In this week’s #ThinkChat segment, Tom and I discuss how the role of the data scientist is changing, where data scientists fit in an organization, and whether everyone who has stealthily added the title to their resume owes Tom a cut of their newfound wages!
#ThinkChat Conversation with Tom Davenport Part 7 of 7
To view other segments in the #ThinkChat series click here.
This short video is part of Shawn's #ThinkChat video series with Tom Davenport, professor of management and IT at Babson College. Here the two briefly discuss how the landscape is evolving and how a growing community of advanced analytic users and enablers are fueling change. It’s the age of the analytic amateur and the semi-pro!
John takes a look at the many ways organizations can benefit from a decentralized, collaborative approach to analytics, an approach made realistically possible for more and more companies with the advent of simple, cloud-enabled tools.
Change can be daunting, especially when it involves unfamiliar technology to accomplish daily tasks. So when an entire workforce must migrate to a new software platform after years with legacy code, what kinds of questions do they ask? And how are their fears replaced with curiosity? David Sweenor introduces the first chapter of a three-part e-book describing Dell’s own recent migration.
In his latest article published at www.HealthDataManagement.com, Dr. Hill discusses the technology revolution that will involve predictive analytics in thousands of healthcare applications and workflows, and he shares his perspective regarding the industry's various opportunities, disruptors and hurdles.
In the latest issue of Statistica Monthly News (yes, you can subscribe for free), our readers found a link to a webcast that talks all about Statistica’s new partnership with Microsoft, a relationship that produces some incredible hybrid cloud functionality for data analysis using Azure Machine Learning (ML).
We are talking about a hybrid cloud solution whose powerful functionality completely belies the color Azure is named for: a shade of bright blue often likened to a cloudless sky. Cloudless? Hardly. The Statistica-Microsoft partnership is all about the cloud!
The fun story in the webcast describes how one website was running an analytics program as an API on Azure. Designed to guess ages and genders of people in photographic images, the site was expecting a few thousand submissions, but it went from zero to 1.2 million hourly visitors within just two days of going live, and up to seven million images per hour. By day six, 50.5 million users had submitted over 380 million photos! Normally, we would hear about sites crashing with such a viral overload. But this site kept humming along even when the action ramped up so dramatically, primarily because Azure scaled dynamically as intended, handling the unforeseen load like a champ.
Think about embedding this kind of cloud access and flexible scalability as a directly callable function inside Statistica—well, that just makes way too much sense, right? But that is what’s happened! Azure ML is really a development environment for creating APIs on Azure, with the intent to enable users to have machine learning in any application, whether that is a web app or a complex workflow driven by Statistica. For instance, you can host your complicated models in the cloud with Azure and run non-sensitive, big data analytics out there—a very practical time saver and money saver. Then you can bring those analyzed results back down to join perhaps more sensitive data and analytics output behind your firewall. You can learn more when you watch our “Cloud Analytics” webcast.
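Under the hood, a cloud-hosted model is typically just an HTTP endpoint that an application scores against. The sketch below, in Python’s standard library, shows the general shape of such a call. To be clear, the endpoint URL, bearer-token header and payload format here are invented for illustration; they are not Azure ML’s actual API, so consult the service’s documentation for the real request format.

```python
import json
import urllib.request

def build_scoring_request(endpoint, api_key, rows):
    """Build an HTTP request that sends feature rows to a cloud-hosted
    model for scoring. The endpoint and payload shape are hypothetical."""
    payload = json.dumps({"rows": rows}).encode("utf-8")
    return urllib.request.Request(
        endpoint,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + api_key,
        },
    )

# Scoring then happens via urllib.request.urlopen(request); the JSON
# that comes back can be joined with on-premises data behind the firewall.
```

The appeal of the pattern is exactly what the webcast describes: the heavy, non-sensitive computation runs in the cloud, and only the results cross back into your environment.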
You might have heard that the term "big data" is over-hyped. Maybe you have even heard that it has already slid into the “trough of disillusionment” (as far back as early 2013, if you believe everything you read on the Internet). Even if these assessments are true, the fact remains that the term itself is still relevant and apt for many a business person seeking tips and best practices for developing analytics projects.
In marketing-speak, the “big data” slogan has stickiness. But for all its ubiquity—or, perhaps, because of it—the term "big data" remains something of an enigma, a source of curiosity for business leaders and data executives worldwide. That is to say, business people wanting to get into analytics still respond to that term more than others.
To be fair, the longevity of “big data” works in its favor. Currently, people search for “big data” on Google an average of 60.5K times per month, perhaps because the term seems all-encompassing and broadly descriptive, a good place to start asking questions. Meanwhile, more recent phrases—despite their own merits and relevance—are not sought out nearly as often. For instance, “internet of things” currently averages only 40.5K monthly searches, and “predictive analytics” clocks in at 9.9K. And even if you think “cloud analytics” is destined to be the Google rage someday, right now that phrase averages only 390 searches per month. (That’s not a typo: it is 390.)
This popularity is why we still like to use the “big data” moniker when talking about Statistica’s analytics prowess. Did you read our July issue of Statistica Monthly News? (Yes, you can subscribe for free.) In the sidebar list of events, our subscribers have already seen that we are offering a free Tech Webcast on July 30, “Statistica Eats Big Data for Breakfast.” This webcast will be presented by Mark Davis, the founder of Kitenga and now Distinguished Engineer at Dell Software. He will be focusing on the newer big data capabilities within Dell Statistica 12.7 and how those capabilities can benefit businesses in a variety of use cases, perhaps even in your industry.
Register today and spread the word!
Dell’s SAS migration began shortly after we acquired the advanced analytics product Statistica. Within weeks, we had decided to move all of the company’s analytics users from SAS to Statistica. After assessing how the migration would affect employees, the next challenge was to get everyone on board.
Change can be daunting, especially when it involves embracing unfamiliar technology to accomplish daily tasks. Our main strategy for getting employees on board was to replace fear of an unknown product with curiosity about how best to accomplish analytical tasks with it.
Download the e-book, “SAS to Statistica: The Great Dell Migration – Part 1: People”
Understanding the Reactions
You don’t expect any sweeping change to be widely met with open arms and high-fives, so we were certainly prepared to address concerns from the Dell workforce. When the news was announced, most reactions fell into three buckets:
Most users had never heard of Statistica, and many of them felt a deep-seated career attachment to SAS. Once we realized that, we started working on ways to replace their fear of an unknown product with curiosity about Statistica.
Addressing the Concerns
It was our responsibility to arm employees with as much knowledge about Statistica as possible. We began by arranging communication between our employees and our migration leads from Statistica, to show them that their long years of work would not simply be discarded.
The leads examined the techniques and functions our users had worked with in SAS – K-means clustering, polynomial regression, GLM, ARIMA, neural networks and more – and demonstrated how to replicate and enhance them in Statistica. Nearly all the techniques they had used in SAS were available in Statistica, and they were easier to implement. In short, users didn’t need to rewrite thousands of lines of code; they simply dragged and dropped icons on the Statistica workspace.
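To give a neutral sense of what one of those techniques involves, here is a minimal K-means clustering sketch in Python with scikit-learn and synthetic data. The library choice is ours purely for illustration; this is neither Dell’s internal code nor Statistica’s API, which exposes the same technique through its drag-and-drop workspace.

```python
# Illustrative only: K-means, one of the techniques users migrated.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Two well-separated synthetic "customer segments" in 2-D feature space
data = np.vstack([
    rng.normal(loc=0.0, scale=0.5, size=(50, 2)),
    rng.normal(loc=3.0, scale=0.5, size=(50, 2)),
])

# Fit K-means with two clusters; random_state makes the run repeatable
model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(data)
print(sorted(np.bincount(model.labels_)))  # cluster sizes
```

In a workflow tool, the same job reduces to connecting a data source node to a clustering node and setting the cluster count, which is the point the migration leads were making.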
While the discussions eased their concerns somewhat, getting full-scale buy-in required more comprehensive onboarding. We’ll take a closer look at those strategies in an upcoming post.
Download the Dell-on-Dell case study, “SAS to Statistica: The Great Dell Migration – Part 1: People,” to learn more about anticipating the reaction among your business users when you undertake an analytics migration.
New data and new insights are giving rise to new data-driven products and contributing to the digital economy. Companies with data-driven insights into markets, buying behavior, product performance and procedure execution can leverage that insight to build innovative new products based on this data. These new data products can create meaningful revenue opportunities and enhance customer care and overall corporate execution.
In this week’s #ThinkChat segment Tom and I discuss how companies like GE, Monsanto, Google and Facebook are leading the way with data product innovation and how traditional smaller companies can get in on this opportunity to turn their data into new services, products and revenue streams.
#ThinkChat Conversation with Tom Davenport Part 5 of 7
To view other segments in the #ThinkChat series, click here.
Those of a certain age may recall a famous ad campaign where the well-coiffed Sy Sperling stated, "I'm not only the Hair Club president, but I'm also a client."
And who can forget pop culture icon, Victor Kiam, of Remington Products fame: "I liked the shaver so much, I bought the company!"
Now add Michael Dell to that lineup. After adding the Statistica analytics platform to the Dell Software portfolio in 2014, the company founder and namesake decided it would be smart to roll out the product in-house in an ambitious proof-of-concept showcase that would answer the question on the minds of many a CTO: Is it possible to migrate a longstanding, major SAS shop to Statistica without needless disruption to culture or infrastructure? Our company answered that question—switching about 300 users and converting 300+ projects across multiple business units in only six months.
Whaaaat? You didn’t hear about this amazing feat? Our Statistica Monthly News subscribers already read about this. (Yes, you can subscribe for free.) In the technology world, using your own product is known as eating your own dog food or drinking your own champagne. And that’s exactly what Dell is doing.
Maybe you have been wondering whether your own company could relieve itself of the costs and complexities of SAS. Well, now you can wrap your brain around that concept by reading exactly how Dell did it. Click through to our July newsletter below and look at the lead story there. While you are at it, be sure to subscribe to the newsletter so you can keep up with other useful info as we move forward.
And remember: “I’m not only the Statistica newsletter editor, but I’m also a subscriber.”
In 2014, Dell acquired the advanced analytics product Statistica and set out on a project to migrate all of the company’s analytics to our newly acquired solution. In our Analytics Migration Series, we’re taking a closer look at our journey in hopes of offering insight to other organizations embarking on this arduous but sometimes necessary process.
One of the first tasks we faced was analyzing who would be impacted by the migration and defining their job functions. Dell is no different than most companies in that with any software usage there are levels of engagement. Most employees are casual users, working with a subset of the product functions to accomplish their daily tasks. Then, there is a smaller subset of power users who eat, sleep and breathe the product, who will be most affected.
We quickly identified hundreds of users whom we needed to move to Statistica. The largest subset of those users works in Dell Global Analytics (DGA), a group that provides analytics expertise and support to a wide range of functional organizations throughout the company that don’t have their own internal analytics expertise. Here’s an overview of how the DGA team’s expertise impacts the company:
In short, analytics is pervasive throughout Dell, DGA is a big part of our competitive advantage, and the migration affected our data analysts and users in all of those groups. More importantly, since analytics is ubiquitous at Dell and embedded in our systems and processes, many more people consume and rely on the analytic output. Any adverse change to that output could drastically impact the business.
As a result, we had to take extra care to ensure the DGA team was fully on board with the process and make certain that their ability to deliver this business-critical data across the organization was not hindered. This step was challenging and time consuming, but it increased the chances of a successful migration and minimized potential business disruptions.
Download the eBook, “SAS to Statistica: The Great Dell Migration – Part 1: People,” for more insights into embarking on your own analytics migration project.
The term “analytics platform migration” can elicit the same reaction as “root canal” or “Can you take me to the airport?” It is—for most companies—necessary at some point, but not particularly pleasant.
In the coming weeks, we’ll share our own Dell-on-Dell story of migrating from one analytics platform, SAS, to the platform we purchased, Statistica. We want to offer insight into why we decided to make the change and to help other organizations take on this arduous process while maintaining business continuity.
Download the Dell-on-Dell Case Study, “SAS to Statistica: The Great Dell Migration – Part 1: People”
Our story begins when Dell acquired the advanced analytics product Statistica (formerly StatSoft). Within weeks of the acquisition, we set out to end our use of SAS over a six-month timeframe and adopt Statistica as the core analytics platform companywide. To offer insight into the scope of our migration project, here are a few high-level results:
The first and most obvious question is why Dell would decide to migrate to Statistica in the first place. From our perspective, it wouldn’t be right to ask our customers to consider using Statistica if we weren’t willing to use it ourselves. When prospects ask whether Dell uses the product they are promoting, our sales teams need to be able to answer, “Yes, we drink our own champagne.”
Other critical factors included:
Enabling the analytics enterprise. We’ve all read about it in books, magazines and news articles — we need to do something about analytics and big data. Organizations that embed analytics within all parts of their business to make faster decisions and improve decision making, planning and forecasting have a distinct competitive advantage. Unfortunately, there is a skills shortage, so we need a software package that plays well with existing IT investments, is secure, and is sufficiently easy to use. The goal is to enable all users — experts and line of business users alike — to make the most of their data.
Lowering the cost of licensing. Even before Dell acquired Statistica, customers and prospects had been telling us that they wanted analytics software that was less expensive and, more important, able to make analytics accessible to more people in their organization. But they didn’t know what a migration project would entail and, like most companies, they were concerned about the downside of an unsuccessful one.
Bang for the buck. Dell, like most companies, derives significant value and competitive advantage from applying analytics to areas of the business like marketing, price optimization, forecasting, supply chain optimization, preventive maintenance and credit risk analysis. We believed we could get better value for less money with the full range of analytic muscle and ease of use we saw in Statistica.
More analysis, less code-writing. SAS is powerful, but you have to hire people to write and maintain code and administer the complex system to get the most out of it. We, along with many of our customers, were having increasing trouble finding good replacements for the SAS-savvy people who were leaving or retiring, so the cost of keeping SAS was rising beyond our annual licensing fees.
It’s the best way to improve the product. You know that the path to a better product leads from your front door to your loading dock. Given that Dell is making significant investments in Statistica, it made all the sense in the world to use it ourselves and see what our customers were experiencing.
Download the case study, “SAS to Statistica: The Great Dell Migration – Part 1: People.” We think you’ll enjoy this rare look into how Dell weaves analytics into almost everything we do, and how much like most companies we are.
I’m often asked how the new age of big data will impact small and mid-sized businesses (SMBs). Can they keep up? Can they stay relevant in the age of big data? The answer is a resounding yes. After all, big data really represents all data, regardless of what it looks like or where it resides. We’re talking social media, internet, IoT, images and digital media, as well as all those forms of structured data we’re quick to forget about but that still dominate the data management landscape. In no uncertain terms, the SMB is just as interested, if not more interested, in taking advantage of this pool of data. Just like their enterprise counterparts, SMBs are actively searching for the value and business opportunity hidden within data.
In many ways, the new data landscape is completely altering the old competitive landscape as it pertains to SMBs and enterprises. The growth of so-called big data hasn’t made SMBs less competitive or less innovative. In fact, it’s done just the opposite. Thanks to advancements in our ability to capture and analyze data, SMBs can now drive innovation in ways previously reserved for the enterprise. Managing “all data” gives the business, regardless of their size or budget, the ability to better understand their customers, their businesses, and their marketplaces. All of which means that in this new data ecosystem, SMBs are more competitive with bigger, richer enterprises than they’ve ever been in the past.
Which brings us to Microsoft. Microsoft, like Dell, has long been known as a champion of the middle market, and, again like Dell, it is clearly focused on taking that commitment to the next level amid the changing data landscape. You can already see how customers’ need to corral big data is shaping the way Microsoft supports the SMB ecosystem. Microsoft has invested in creating and evolving many products to support the changing needs of the SMB. A great example is the company’s aggressive investment in Azure. Other examples of Microsoft investing heavily to meet SMBs’ evolving needs can be seen in the company’s Excel and SharePoint brands.
To understand what this all means for Dell (spoiler: it’s a great thing for Dell), let’s look more closely at Azure. Azure opens up a cloud-based approach to big data storage, giving SMBs a pay-as-you-go option. The Azure platform lets SMBs experience small footprints of the big data ecosystem without committing precious resources to these efforts. These smaller chunks of data are also more representative of what an SMB might need to store and archive. After leveraging Azure storage, Azure ML allows customers to experiment with big data using machine learning, essentially setting up an analytics sandbox for the organization to explore and experiment with analytic capabilities in a meaningful and “by design” manner. Here’s where Dell comes into the mix: Dell Statistica allows SMBs to easily leverage its predictive power in tandem with Azure ML’s compute power to build a best-of-breed solution for advanced analytics.
The combination of Azure and Statistica provides just one great example of Microsoft and Dell technology working together to benefit SMBs. The potential for SMBs to leverage Dell technologies to deliver value on top of their Microsoft investment is virtually limitless, and it’s one of the primary characteristics differentiating Dell in the information management space. Not only is our ability to help you get the most out of your Microsoft investments unique, but so is our willingness. For SMBs and enterprises alike, Dell is the platform-agnostic vendor. Our relationship with Microsoft is stronger than ever, but when we say all data, we mean it. So, whatever investments SMBs make and whatever path they travel down in order to get more out of their data, we can go down it with them. That’s what all data is all about.
See how data growth and new technologies are affecting the DBA ― read the eye-opening study today.
You never know what you will learn from the helpful Statistica newsletter. (Subscribe for free.) Our recent June issue brought to our readers’ attention that legacy StatSoft’s social media properties have been changed up quite a bit. You won’t find us by the same name anywhere anymore!
Well, that’s not entirely true, but readers did learn that our old Facebook page has now expanded to include all our fellow software teammates in Dell’s Information Management Group. So, now when you go visit our page, you will find our Statistica content mixed with that of Dell Boomi, SharePlex, and Toad. It’s like we suddenly discovered an extended family with whom we share much in common—primarily we share the successful end-to-end workflow of YOUR data. We think you should stop by and get to know these family members, too, and then “like” them the way you’ve always liked Statistica.
Find all our new social media links in the June newsletter >
Integration, Analytics and Process are all part of the Internet of Things (IoT) value chain. Pulling it all together is much harder than it seems and will present a significant challenge to many companies looking to derive value from IoT initiatives. Successful companies will need to employ a strategy built on a flexible infrastructure that manages data and analytics at the edge and embeds them into critical applications.
In this week’s #ThinkChat segment, Tom and I discuss why the Analytics of Things are more important than the Internet of Things (IoT). Neither of us discounts the value of the infrastructure or the data, but in the end, actionable insights are what drive the ROI, and it’s impossible to get there without the analytics.
#ThinkChat Conversation with Tom Davenport Part 4 of 7
To view other segments in the #ThinkChat series click here.
The headliner in the latest Statistica e-newsletter was hard to miss, announcing the official release of Statistica 12.7. Thirty-one years in the making and our analytics platform just keeps getting better! There were no trumpets or parades, but that doesn't mean there is not some really cool stuff in there. I won’t go into details about 12.7 here —that's what the newsletter is for! (Yes, you can subscribe for free.)
By late 1987, StatSoft had expanded and integrated all its lines of software into one large statistical package called CSS (Complete Statistical System). CSS included prototypes of many of the unique input, output, and analysis control features that would later become trademarks of StatSoft’s software technology.
As CSS (and MacSS for the Macintosh) became popular, StatSoft was already devoting all R&D resources to the development of a new, advanced line of statistics software: STATISTICA, which was to offer entirely new levels of functionality not available in any other data analysis software at the time.
And yes, in case you were wondering, the trademarked product name was always presented in italicized capital letters, even in body copy.
The first (DOS) version of STATISTICA was released in March 1991, followed by STATISTICA/Mac in January 1992. Finally, STATISTICA for Windows (aka STATISTICA 4.0) was pre-released in 1993, representing the crowning achievement of StatSoft’s R&D efforts. The platform’s graphics technology and numerical integration, the flawlessness of its user interface, and the capacity and speed of its computational modules all set new standards for numerical and graphical analysis software.
The rest, as they say, is history. Subsequent releases of STATISTICA continued to address enterprise, connectivity, and scalability needs in the world economy, and the platform continued to set new performance, quality, capacity, and comprehensiveness standards for statistics, graphics, and analytic data management software. No wonder it eventually caught Dell’s attention.
StatSoft had steadily worked the software up to version 12 before the Dell acquisition in March 2014, after which the Statistica name itself got a makeover: no more italicized caps. Now the latest release this year is Statistica 12.7. And next…? Stay tuned.
These days there’s a lot of talk about big data and its effect on privacy. After all, we now work with vast amounts of data that wasn’t practical, accessible or simple to leverage in the recent past. It used to be true that companies only tapped into 20 percent of their data resources, leaving the remaining 80 percent untouched because it was too costly and difficult to utilize fully.
Not anymore. Today, innovative companies are striving to use all of their data (#AllData). Advances in data mining and big data analytics are enabling innovation at the speed of business. We can mash-up, manipulate and mine information to do great, new insightful things. We can take advantage of derived data, which leverages several points of data to create new data about just about everything, including buying patterns, consumer preferences, business directions—the list goes on and on.
But before we get carried away with the endless possibilities, let’s remember a quote from Voltaire: “With great power comes great responsibility.” Companies that start down this path—and it’s a crowded one these days—must walk a fine line between innovation and icky.
Most everyone appreciates when Amazon makes suggestions for additional purchases based on behavior data. In other scenarios, data that is derived can come as a complete surprise—such as when a retailer uses shopping basket analysis to determine that you have a cold or a baby on the way. When Amazon uses data about you, it feels innovative. When a retailer creates data about you, it feels downright icky.
With the advent of big data technologies, there’s more and more data included in analytic work streams that simply wasn’t available before. Issues around privacy are very fluid right now. Common sense and best practices should prevail during conversations about where the boundaries of innovation and “ickiness” cross.
According to a recent AP story on high-tech fitting rooms, some retailers are testing versions of technically advanced fitting rooms with “smart mirrors” to compete with online retailers. The technology enables a brick-and-mortar retailer to collect much of the same behavior data as online retailers and then use it to recommend other products. So, would you appreciate a mirror that suggests a pair of jeans to go with the red shirt you just tried on or is that an invasion of privacy?
Consumer advocates already are voicing concerns about who ultimately has control over the data collected. Governments worldwide are starting to pass legislation and guidelines around digital privacy. It’s early, but the conversations need to continue so regulations can be developed to protect people from what they don’t fully understand.
Recently, I bought new doorknobs for my kitchen cabinets and for weeks, I was inundated with online ads for doorknobs, even if I was visiting a sports, cooking or news website. Most people don’t know that major websites share data as part of behavioral ad targeting. I personally think it’s cool when Amazon suggests a book I might like based on a previous purchase. But, a sports site trying to sell me more doorknobs falls into the icky camp.
That’s why it’s so important to understand both the context and circumstance of how data will be used. I spoke with an educator from a small school district on the east coast where analytics were being gathered on K-6 students. The goal was to data mine all available information on a student to identify the Key Performance Indicators (KPIs) that would correlate with how likely he or she was to graduate from high school. A fantastic use of data, isn’t it? But there also is a potential downside. How do you share it? Or should you share it? Do the parents deserve to know? Will the knowledge affect how teachers interact with students?
According to an article in Time, a movement is stirring in about 125 schools around the country. Officials are sifting through years of grades from thousands of former students to predict what will happen to current classmates. One university uses data to determine which students would have a higher propensity of graduating while other schools have learned to minimize costs of recruiting new students who they believe are more at risk.
While big data and analytics are incredible, there is a double-edged sword around proper use of this information. For example, there’s a teachers’ union that is working with its state to change the compensation policy to one that is more performance-based. While that would seem all well and good, what if a school gathers data that reveals which students will not do well and then teachers don’t want these students because they could negatively impact their compensation? Or what if students find out their predicted fate and it turns into a self-fulfilling prophecy?
At Dell, we understand innovation comes with responsibility. We strive to keep our finger on the pulse of governance and privacy best practices so we don’t cross the boundaries from innovation to icky. Have any thoughts on walking this fine line? If so, drop me a line at firstname.lastname@example.org.
What do the 43rd President of the United States and I have in common? We like hanging out with healthcare technology professionals! President George W. Bush gave the closing keynote at this year’s Health Information and Management Systems Society (HIMSS) event in Chicago, and Dell was one of the corporate sponsors. It’s been a few years since my last visit to HIMSS, and I have to say I was extremely impressed. The event is attended by over 42,000 people and seems to cover every square foot of McCormick Place. Dell had an extremely strong presence at the program, hosting a charity in the booth, a dozen different Dell HCLS solution demos and three live tweetups.
The technology themes were varied throughout the event, and I was there to help lead a discussion on population health with Dell experts Dr. Gary Miner, Dr. Tom Hill and Dell’s acting Chief Medical Officer, Dr. Charlotte Hovet. We were also joined by Dr. Ken Yale, Vice President of Clinical Solutions, ActiveHealth Management. It’s interesting to see where data is playing a role in driving more consistent and higher quality patient care. Population health obviously benefits from data-driven insights. Technologies like advanced analytics are helping us move beyond an understanding of large populations to focus on more personalized patient care via diverse data and insightful analytics. As we leverage more data and a greater variety of information on specific patients, the ability to personalize care and apply a customized level of best practices will result in much better overall patient care.
L-R Shawn Rogers, Dr. Gary Miner, Dr. Tom Hill, Dr. Charlotte Hovet and Dr. Ken Yale
The end result, as advanced analytics drives patient care forward, will be precision healthcare, where caregivers are able to execute specific regimens for each individual based on his or her specific needs, chemistry, DNA and other personalized markers and prerequisites. It’s exciting to think that advanced analytics has the ability to enhance treatment and deliver personalized healthcare. Innovation isn’t without its hurdles: connectivity to data, and patients’ new responsibility to bring their own data into play, will prove difficult. New trends will include device information on a patient’s exercise and activities, diet, location, travel history and more. Advanced analytic platforms will factor many new data points into models in order to achieve the highly specific care plans required by precision medical treatments. Look for caregivers to push back a bit as the culture of human knowledge and instinct collides with automated, model-driven best practices. I believe that ultimately both voices need to be heard in order to supply the best possible care. Dr. Hovet made the point that even though analytic platforms will supply a path for treatment, doctors will still play a critical role in communicating, implementing and executing precision medical treatment. The days of the robot doctor are still far in our future.
Having been in the data business for as long as I have, I found the themes at HIMSS to be exciting and full of promise for immediate and future innovations based on our ability to leverage greater amounts of data, and a wider, more dynamic range of information, to add value to overall patient care. These are exciting, data-driven days for healthcare.
Perhaps some of you remember this successful advertising campaign of a bygone era: "The Maidenform Woman: You never know where she'll turn up." I was reminded of this while compiling the June issue of the Statistica e-newsletter.
Those of you in the know are surely raising your eyebrows by now. That's because Maidenform was promoting women's undergarments, not analytics solutions. Nonetheless, the shared concept of ubiquity is where I found the Statistica connection.
Specifically, I was prepping a lengthy list of events in EMEA and North America, everything from tradeshows and conferences to workshops and webcasts. But it was the EMEA events, both big and small, that struck me with their variety of non-analytic-sounding titles reflecting different industry verticals: HIMSS, Interop, Oil & Gas, Cloud World, IsisTech & Oxford AHSN eHealth. Of course, it helps to know what the acronyms stand for, but here's my point: Statistica's analytics solutions and data tech compatibility are applicable in just about every industry across the spectrum, so almost every business event out there is relevant.
That is why Dell Software is sponsoring these events and many more. It is important that we meet decision makers where they are and help them envision the suitability of our solutions, even at functions that—at first glance—might not seem to be practical venues for exposure. Our calendar of events is certainly in keeping with our current mission to embed analytics everywhere, empower more people, and innovate faster. And as we continue to increase our reach through such varied opportunities, Statistica becomes like the Maidenform Woman: you never know where we'll turn up!
Read (and subscribe to) the June newsletter >
Everyone loves superheroes. As if we need proof, Avengers: Age of Ultron has already raked in more than $1.1 billion. Perhaps it’s because we all love to fantasize about having superpowers. Who wouldn’t want super strength like Captain America? How about the ability to fly, so you could whip past traffic on your morning commute?
Our daydreams could go on and on, but we suspect there’s one superpower that would help you right now: the ability to quickly and easily analyze the massive volumes of data your organization collects each day. Imagine if you had Hulk-like strength to smash down data barriers. You could effortlessly collect, integrate, analyze and use all that data. But without the right powers ― and with constant demands to help generate business insights ― your data battles may seem more daunting than a faceoff with Ultron himself.
Well, that whole fantasy about ruling your data universe is about to become a reality. That’s because we’re delivering the power you need to easily turn data into actionable information. And we’re talking about virtually all data here: data coming from traditional on-premises sources as well as cloud-based data sources and modern data stores like Hadoop and NoSQL. So if you’re ready to get your analytic superpowers on, check out our latest version of Statistica, the advanced analytics platform.
Two important enhancements in Statistica 12.7 will empower you to take data analytics to the next level. We partnered with Datawatch Corporation to boost the advanced analytic capabilities of Statistica with enhanced interactive visualization and dashboarding. Rich, visual representations of various data streams will help you easily identify opportunities and hidden patterns. This release also offers self-service data preparation and real-time streaming to put data analysis power in business users’ hands.
In his recent article, “Dell Brings Advanced Visualization to Analytics Platform,” CIO’s Thor Olavsrud noted that the addition of these advanced interactive visualization tools and dashboard capabilities extend the applicability of Statistica to additional users, including business analysts.
Statistica 12.7 also integrates with our Toad and Boomi product lines, delivering connectivity to more than 164 data sources, cloud or on-premises, in motion or at rest. Further development of the Statistica big data analytics module, with enhanced text-mining capabilities, natural language processing and search tools, expands the product’s ability to derive insights from unstructured data. The coolest part is that the Statistica big data analytics module brings advanced analytics to the data, rather than the data to the math.
We already have customers using the newly integrated Datawatch capabilities. Don your own analytical superpowers with a free trial of Statistica 12.7 today.
As quants become more critical to your overall analytic environment, there will be growing pains between them and the line-of-business executives they serve. Many business sponsors see quants as alien beings, math magicians from another planet. Quants can and do deliver extremely useful insights, but it remains the responsibility of the business leader to turn that insight into valuable action. Feeding an executive’s need to merely “look” informed is a recipe for disaster. It’s critical for a business leader to stay focused on action, not just the collection of information.
In this week’s #ThinkChat segment Tom Davenport and I discuss the dynamics of quants and their business partners and touch on a few companies who are doing it right and getting high value from their investments in Big Data and Quants.
#ThinkChat Conversation with Tom Davenport Part 2 of 7
At this year’s Dell Annual Analyst Conference (DAAC), Michael Dell was crystal clear about the immense opportunity the Internet of Things (IoT) represents. In fact, he called it the trillion dollar opportunity.
Not surprisingly, IoT was one of the big trending topics also discussed by analysts, sourcing advisors, Dell partners and customers alike. It was reflected in keynote speeches, breakout sessions and in demos. The building automation demo by Dell OEM partner KMC Controls showcased how data taken from sensors in a building, aggregated via a gateway and then analyzed can help make a building more energy efficient and safer. Next to this demo, Dell Software showed how an advanced analytics solution built on top of the KMC Controls solution can provide comprehensive data integration and predictive analytics-based insights.
The Internet of Things
Let’s step back a bit. IoT is not new and although it is often framed as an emerging trend, it is no longer a prospect of the future. Many companies already have sensors on their equipment used for predictive maintenance. The Internet of Things can be described as an ecosystem where sensors, devices and equipment are connected to a network, and can transmit and receive data for tracking, analyzing, and taking business actions. What does this data journey look like?
The data journey
The journey of the data begins at the sensor connected to a device, for example an air conditioning unit or refrigerator. Now the data has to travel via a wireless or connected medium to get to an aggregation point, either a gateway or a datacenter. Gateways, such as the just-launched Dell IoT Gateway, are small, wireless or connected devices that collect, help secure and process sensor data at the edge of a network. They represent one way of collecting data and can be used as a smart device to perform real-time monitoring and analytics of streaming data.
Next, the traveling sensor data has to be integrated into a much larger pool of data, including non-sensor data such as weather, social media, CRM, business or other machine-to-machine (M2M) data. A platform like Dell Boomi allows for seamless, real-time data integration and normalization on the gateway, in the datacenter or in the cloud. At this level, Boomi also adds additional value in terms of ensuring industry data compliance (HIPAA, Safe Harbor, PCI, SOC II).
The data value resides in analytics
Nicely integrated, this data is now ready for analytics (Dell Statistica). This video explains the value proposition.
In short, here is how it works. There are two ways to analyze the data: (1) perform real-time analytics on streaming data (time series; see examples 1 and 2 below) at the edge (gateway), and (2) perform deeper analytics on the historic data set in the datacenter to help predict maintenance events or forecast business trends.
An example of streaming data analytics is shown in figures 1 and 2. Figure 1 (follow link) shows the monitoring of refrigerator compressor temperature against ambient temperature in real time, setting off alerts to the building operator in case of anomalies, such as an unhealthy compressor.
Figure 2 (also shown above) shows real-time operational analytics mashing coffee pricing data (milk, paper cup, coffee grounds, etc.) with sensor data from inside the manufacturing facility to monitor the overall health of the manufacturing facility.
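The streaming checks in figures 1 and 2 boil down to tracking a rolling statistic on a sensor series and alerting when a reading falls outside its normal band. As an illustration only (this is a generic sketch, not Statistica's actual implementation; the window size and three-sigma rule are assumptions), a minimal edge-side monitor for the compressor-vs-ambient temperature delta might look like this:

```python
from collections import deque

def monitor(readings, window=20, k=3.0):
    """Flag compressor readings that drift anomalously far from ambient.

    readings: iterable of (compressor_temp, ambient_temp) pairs.
    Yields the index of each anomalous reading, using a rolling
    mean/stddev of the temperature delta (a simple k-sigma rule).
    """
    deltas = deque(maxlen=window)  # sliding window of recent deltas
    for i, (comp, amb) in enumerate(readings):
        delta = comp - amb
        if len(deltas) == window:
            mean = sum(deltas) / window
            var = sum((d - mean) ** 2 for d in deltas) / window
            std = var ** 0.5
            # Alert if the new delta deviates more than k standard
            # deviations from the recent rolling mean.
            if std > 0 and abs(delta - mean) > k * std:
                yield i
        deltas.append(delta)
```

In a real gateway deployment, each yielded index would instead trigger an alert to the building operator, as in the compressor example above.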
Figure 3 (follow link) provides an example of predictive analytics performed on data at rest. This example shows how we can use multivariate process monitoring and control statistics to predict machine health. The analytics show that there is a definite correlation between the speed of the cup movement by the robotic arm inside the coffee machine and the ice dispensing speed, which points to the need for maintenance of the robotic arm.
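The figure 3 analysis rests on measuring how strongly two sensor series move together. As a hedged sketch of that idea (the `maintenance_flag` rule and its 0.8 threshold are hypothetical illustrations, not the actual multivariate model used), a Pearson correlation between arm speed and ice-dispensing speed could surface the coupling:

```python
import math

def pearson(xs, ys):
    """Pearson correlation between two equal-length sensor series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def maintenance_flag(arm_speed, ice_speed, threshold=0.8):
    """Crude heuristic: if the two rates have become strongly
    correlated, the arm may be dragging the dispenser and is
    due for maintenance (illustrative rule only)."""
    return pearson(arm_speed, ice_speed) > threshold
```

A production system would track this statistic over time and combine it with other control charts, but the core signal, a correlation that should not be there, is the same.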
KMC Controls, via its building automation solution, controls and monitors the devices and sensors that are installed in a building. There is an opportunity to develop a centralized view of the entire building to provide a comprehensive IT management solution which includes IoT devices and sensors as well as all IT assets that are managed inside the building.
Many devices work on their own secure network, and building automation manufacturers like KMC Controls need to consider what security measures must be implemented for detecting and blocking malicious activity over non-standard network protocols. A natural starting point is to consider extending the existing firewall technologies to comprehend these new devices.
Learn more about IoT solutions
To assist companies in overcoming the hurdles in setting up industrial IoT technologies, Dell has set up the Dell Internet of Things (IoT) Lab. Companies can come to the lab, located in the Dell Silicon Valley Solution Center in Santa Clara, CA, to receive assistance in architecting solutions to their IoT needs.
To learn more about the Dell IoT Lab, please see http://dell.com/iot. To learn more about the software solutions Dell offers in this space, please visit http://software.dell.com/
In this second part of our interview series with Dr. Charlotte Hovet, medical director of Global Healthcare Solutions at Dell, we examine what healthcare will look like in 2020 and offer tips for getting started.
What will healthcare look like in 2020?
The world of healthcare will look different in five years and significantly different in 10 years as providers and patients adapt to disruptive change. Technology, consumerism and new payment models are reshaping the delivery of healthcare, and as a result, we can anticipate better care, better health and lower costs. However, that doesn’t mean there won’t be significant stumbling blocks along the way.
I’ve traversed the globe for nearly a decade advocating change and while physicians are quick to adopt new medical devices, they’ve been slow to embrace Electronic Health Records (EHR). And, often for good reasons. New technologies must enhance the effectiveness and efficiency of clinical practice and align with people, process and policy changes. As such, challenges will linger over the next five years as phase two of the EHR continues and progress is made on the integration, interoperability and security fronts.
Full adoption of patient-centered tools will take time and patience as well as assistance to overcome steep learning curves. Moving to a digital world is certainly disruptive but the upside is tremendous—a world of true clinical collaboration and innovation. The ability to deliver integrated services will spawn new care-giving models with expanded scopes and teams outside the traditional clinic and hospital settings.
New technologies like telemedicine will emerge that enable people to have care on a daily basis where they need it most—in their homes. Care teams will be able to reach out to patients in remote areas with a focus on prevention and continuity of care. Whether a person has an acute problem or a chronic disease, the capacity for home care will be greatly enhanced. It will be an exciting time in healthcare as we experience value-based, rather than volume-based, service delivery.
Would you say that value-based healthcare is data-driven healthcare?
Absolutely! Analytics and informatics will be the primary drivers in this newly expanded healthcare view. The knowledge we derive from data changes everything—how we interact with patients and how we diagnose and treat them. Let’s look at University of Iowa Hospitals and Clinics, where Dr. John Cromwell is using Dell Statistica to better predict which patients face surgery risks and then determine which medications or wound treatments would be most effective in reducing their chances of a hospital-acquired infection.
How will predictive analytics be useful in lowering healthcare costs?
Instead of focusing most of our efforts on the high-cost, high-risk group, which currently accounts for three-quarters of our healthcare spending, predictive analytics will enable organizations to focus on the rising risk—the middle group—which is often ignored. If we can identify those people who are at risk for chronic disease and actively intervene before they become high risk, we can make major headway in lowering the cost of healthcare delivery while dramatically improving quality.
What’s the best way to get started?
Healthcare transformation requires alignment of people, processes and technology, which we discussed in a recent webinar. We recommend starting with a readiness assessment to reveal where you do—and don’t—have alignment across the organization.
This can be determined by asking basic questions, such as: What clinical analytics do you need and who holds the key to that information? Who on your staff will mine the data and look for trends? What will be done with that information to change the delivery of healthcare services? What role does governance play in all of this? And, what steps need to be taken once all this knowledge is passed along to the appropriate clinical improvement teams? How will they collaborate to identify trends, turn insights into action and change care delivery?
This iterative process needs to involve the physicians and nurses who are directly involved in delivering care. Future healthcare will be highly collaborative and empower healthcare professionals to identify best practices through analytics as well as how they can use this information to improve decision making and patient care outcomes.
How is Dell helping customers accelerate healthcare transformations?
Dell does a great job of guiding customers along their data-driven journeys by bringing together hardware, software and services to address their needs today while providing an IT platform for the future. Today, we’re proud to be working with some of the leading healthcare organizations, including Dignity Health, HealthMarkets, Beth Israel, Boston Medical Center and more.
In my travels, I meet with chief analytics officers, chief technology officers and chief medical information officers. I tell them about a population analytics project with a hospital in the south. I share highlights of a recent pilot using advanced predictive analytics to identify those at risk for exacerbation of asthma or diabetes and the impact on hospital readmissions. I explain how the University of Iowa Hospitals and Clinics has lowered surgery infection risks and subsequently surgery costs.
It’s exciting to share insights, ideas and engage others in healthcare transformation today, so we all can benefit from a new world of healthcare delivery by 2020.
What do you think healthcare will look like by 2020? Email me at email@example.com to offer your forecast.