Blog - Post List
  • Information Management

    In the World of Big Data Analytics, Azure Is More Than Just a Pretty Color

    In the latest issue of Statistica Monthly News (yes, you can subscribe for free), our readers found a link to a webcast that talks all about Statistica’s new partnership with Microsoft, a relationship that produces some incredible hybrid cloud functionality for data analysis using Azure Machine Learning (ML).

    We are talking about a hybrid cloud solution whose powerful functionality completely belies the color Azure is named for: a shade of bright blue often likened to that of a cloudless sky. Cloudless? Hardly. The Statistica-Microsoft partnership is all about the Cloud!

    The fun story in the webcast describes how one website was running an analytics program as an API on Azure. Designed to guess ages and genders of people in photographic images, the site was expecting a few thousand submissions, but it went from zero to 1.2 million hourly visitors within just two days of going live, and up to seven million images per hour. By day six, 50.5 million users had submitted over 380 million photos! Normally, we would hear about sites crashing with such a viral overload. But this site kept humming along even when the action ramped up so dramatically, primarily because Azure scaled dynamically as intended, handling the unforeseen load like a champ.

    Think about embedding this kind of cloud access and flexible scalability as a directly callable function inside Statistica—well, that just makes way too much sense, right? But that is exactly what has happened! Azure ML is really a development environment for creating APIs on Azure, with the intent of letting users embed machine learning in any application, whether that is a web app or a complex workflow driven by Statistica. For instance, you can host your complicated models in the cloud with Azure and run non-sensitive, big data analytics out there—a very practical time saver and money saver. Then you can bring those analyzed results back down to join perhaps more sensitive data and analytics output behind your firewall. You can learn more when you watch our “Cloud Analytics” webcast.
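
    In practice, a model published through Azure ML is exposed as a REST endpoint that any application can POST rows of data to. The sketch below shows the general shape of such a call from Python; the endpoint URL, API key, and input schema are hypothetical placeholders (check your own service's API help page for the exact request envelope it expects).

```python
# Sketch: calling a model hosted as an Azure ML web service from any
# application. The endpoint URL, API key, and input schema below are
# hypothetical placeholders; consult your own service's API help page
# for the exact request envelope it expects.
import json
import urllib.request

SCORING_URL = "https://example.azureml.net/.../execute?api-version=2.0"  # placeholder
API_KEY = "<your-api-key>"  # placeholder

def build_payload(column_names, rows):
    """Wrap tabular input in the envelope Azure ML (classic) services use."""
    return {
        "Inputs": {"input1": {"ColumnNames": column_names, "Values": rows}},
        "GlobalParameters": {},
    }

def score(column_names, rows):
    """POST one batch of rows to the scoring endpoint and return the reply."""
    body = json.dumps(build_payload(column_names, rows)).encode("utf-8")
    request = urllib.request.Request(
        SCORING_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + API_KEY,
        },
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read().decode("utf-8"))

# For an age/gender-style image model, a call might look like (not run here):
payload = build_payload(["image_url"], [["https://example.com/photo.jpg"]])
```

    Because the heavy lifting happens in Azure, the calling application only needs HTTP, which is what makes the hybrid pattern described above practical.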

  • Information Management

    As a Slogan, “Big Data” Still Carries a Big Punch

    You might have heard that the term "big data" is over-hyped. And maybe you have heard that it has already slid into the “trough of disillusionment” (as far back as early 2013, if you believe everything you read on the Internet). Even assuming these assessments are true, the fact remains that the term is still relevant and apt for many a business person seeking best tips and practices for developing analytics projects.

    In marketing-speak, the “big data” slogan has stickiness. But for all its ubiquity—or, perhaps, because of it—the term "big data" remains something of an enigma, a source of curiosity for business leaders and data executives worldwide. That is to say, business people wanting to get into analytics still respond to that term more than others.

    To be fair, the longevity of “big data” works in its favor. Currently, people search for “big data” on Google an average of 60.5K times per month, perhaps because the term seems all-encompassing and broadly descriptive, a good place to start asking questions. Meanwhile, more recent phrases—despite their own merits and relevance—are not sought out nearly as often. For instance, “internet of things” currently averages only 40.5K monthly searches, and “predictive analytics” clocks in at 9.9K. And even if you think “cloud analytics” is destined to be the Google rage someday, right now that phrase averages only 390 searches per month. (That’s not a typo: it is 390.)

    This popularity is why we still like to use the “big data” moniker when talking about Statistica’s analytics prowess. Did you read our July issue of Statistica Monthly News? (Yes, you can subscribe for free.) In the sidebar list of events, our subscribers have already seen that we are offering a free Tech Webcast on July 30, “Statistica Eats Big Data for Breakfast.” This webcast will be presented by Mark Davis, the founder of Kitenga and now Distinguished Engineer at Dell Software. He will be focusing on the newer big data capabilities within Dell Statistica 12.7 and how those capabilities can benefit businesses in a variety of use cases, perhaps even in your industry.

    Register today and spread the word!

  • Information Management

    Analytics Migration Series: Anticipating Business User Reaction

     Dell’s SAS migration began shortly after we acquired the advanced analytics product Statistica. Within weeks, we had decided to move all of the company’s analytics users from SAS to Statistica. After assessing how the migration would affect employees, the next challenge was to get everyone on board.


    Change can be daunting, especially when it involves embracing unfamiliar technology to accomplish daily tasks. Our main strategy for getting employees on board was to replace fear of an unknown product with curiosity about how best to accomplish analytical tasks with it.


    Download the e-book, “SAS to Statistica: The Great Dell Migration – Part 1: People”


    Understanding the Reactions

    You don’t expect any sweeping change to be widely met with open arms and high-fives, so we were certainly prepared to address concerns from the Dell workforce. When the news was announced, most reactions fell into three buckets:

    • “But we’ve never used Statistica.” Our co-workers weren't familiar with how robust an analytics platform Statistica is, so naturally they were skeptical. (That skepticism, after all, is part of why we hired them.) They knew that their work consisted of high-end analytics in SAS and assumed (incorrectly, as it turned out) that Statistica wasn’t up to it.

    • “We’ve spent years writing thousands of lines of SAS code. We don’t want to just throw that away.” Our users balked at all the work of trying to replicate in Statistica the advanced analytics functions they had built in SAS. Who wouldn’t feel that way?

    • “We consider ourselves SAS professionals and analysts first, and employees of Dell second. For career longevity and our ability to do our job, we believe that it's really important to continue using SAS.” That’s a tough one. We found a number of heavy SAS users who had been working with the product for over 20 years. They were comfortable using it and they had grown, evolved and become pretty good with it over much of their career. Asking them to switch to something they didn't know was a huge imposition.


    Most users had never heard of Statistica and many of them felt a deep-seated career-attachment to SAS. Once we realized that, we started working on ways to replace their fear of an unknown product with curiosity about Statistica.

    Addressing the Concerns

    It was our responsibility to arm employees with as much knowledge about Statistica as possible. We began by arranging communication between our employees and our migration leads from Statistica, to show them that their long years of work would not simply be discarded.

    The leads examined the techniques and functions our users had worked with in SAS – K-means clustering, polynomial regression, GLM, ARIMA, neural networks, and more – and demonstrated how to replicate and enhance them in Statistica. Nearly all the techniques they had used in SAS were in Statistica, and were easier to implement. In short, they didn’t need to rewrite thousands of lines of code; they simply dragged and dropped icons on the Statistica workspace.
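
    To make that list of techniques concrete for readers who think in code, here is a toy, pure-Python sketch of one of them, K-means clustering. It is purely illustrative of the kind of analysis that had to survive the migration; the actual migrated work ran in Statistica workspaces, not hand-written scripts like this, and the data points are invented.

```python
# A minimal, pure-Python sketch of K-means clustering (Lloyd's algorithm),
# one of the SAS techniques mentioned above. Illustrative only; the data
# is invented.
import random

def kmeans(points, k, iters=20, seed=0):
    """Cluster 2-D points into k groups; returns (centroids, labels)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # Assignment step: attach each point to its nearest centroid.
        for i, (x, y) in enumerate(points):
            labels[i] = min(
                range(k),
                key=lambda c: (x - centroids[c][0]) ** 2 + (y - centroids[c][1]) ** 2,
            )
        # Update step: move each centroid to the mean of its members.
        for c in range(k):
            members = [points[i] for i in range(len(points)) if labels[i] == c]
            if members:
                centroids[c] = (
                    sum(p[0] for p in members) / len(members),
                    sum(p[1] for p in members) / len(members),
                )
    return centroids, labels

# Two obvious groups of points, near (1, 1) and (8, 8):
data = [(1.0, 1.1), (0.9, 1.0), (1.2, 0.8), (8.0, 8.2), (7.9, 8.1), (8.3, 7.8)]
centroids, labels = kmeans(data, k=2)
```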

    While the discussions eased their concerns somewhat, getting full-scale buy-in required more comprehensive onboarding. We’ll take a closer look at those strategies in an upcoming post.

    Download the Dell-on-Dell case study, “SAS to Statistica: The Great Dell Migration – Part 1: People,” to learn more about anticipating the reaction among your business users when you undertake an analytics migration.

  • Information Management

    #ThinkChat Innovating with Data Driven Products with @ShawnRog and @Tdav

    New data and new insights are giving way to new data-driven products and contributing to the digital economy. Companies with data-driven insight into markets, buying behavior, product performance, and procedure execution can leverage that insight to build innovative new products. These new data products can create meaningful revenue opportunities and enhance customer care and overall corporate execution.

    In this week’s #ThinkChat segment Tom and I discuss how companies like GE, Monsanto, Google and Facebook are leading the way with data product innovation and how traditional smaller companies can get in on this opportunity to turn their data into new services, products and revenue streams. 

     #ThinkChat Conversation with Tom Davenport Part 5 of 7      

    To view other segments in the #ThinkChat series, click here.


  • Information Management

    What Do Sy Sperling, Victor Kiam and Michael Dell Have In Common?

    Those of a certain age may recall a famous ad campaign where the well-coiffed Sy Sperling stated, "I'm not only the Hair Club president, but I'm also a client."

    And who can forget pop culture icon, Victor Kiam, of Remington Products fame: "I liked the shaver so much, I bought the company!"

    Now add Michael Dell to that lineup. After adding the Statistica analytics platform to the Dell Software portfolio in 2014, the company founder and namesake decided it would be smart to roll out the product in-house in an ambitious proof-of-concept showcase that would answer the question on the minds of many a CTO: Is it possible to migrate a longstanding, major SAS shop to Statistica without needless disruption to culture or infrastructure? Our company answered that question—switching about 300 users and converting 300+ projects across multiple business units in only six months.

    Whaaaat? You didn’t hear about this amazing feat? Our Statistica Monthly News subscribers already read about this. (Yes, you can subscribe for free.) In the technology world, using your own product is known as eating your own dog food or drinking your own champagne. And that’s exactly what Dell is doing.

    Maybe you have been wondering whether your own company could relieve itself of the costs and complexities of SAS. Well, now you can wrap your brain around that concept by reading exactly how Dell did it. Click through to our July newsletter below and look at the lead story there. While you are at it, be sure to subscribe to the newsletter so you can keep up with other useful info as we move forward.

    And remember: “I’m not only the Statistica newsletter editor, but I’m also a subscriber.” 

  • Information Management

    Analytics Migration Series: Preventing Disruption of Data Analytics

    In 2014, Dell acquired the advanced analytics product Statistica, and set out on a project to migrate all of the company’s analytics to our newly acquired solution. In our Analytics Migration Series, we’re taking a closer look at our journey in hopes of offering insight to other organizations embarking on this arduous but sometimes necessary process.


    One of the first tasks we faced was analyzing who would be impacted by the migration and defining their job functions. Dell is no different from most companies in that any software has levels of engagement. Most employees are casual users, working with a subset of the product functions to accomplish their daily tasks. Then there is a smaller subset of power users who eat, sleep and breathe the product, and they will be most affected.


    Download the e-book, “SAS to Statistica: The Great Dell Migration – Part 1: People”


    We quickly identified hundreds of users whom we needed to move to Statistica. The largest subset of those users works in Dell Global Analytics (DGA), a group that provides analytic expertise and support to a wide range of functional organizations throughout the company that don’t have their own internal analytics expertise. Here’s an overview of how the DGA team’s expertise impacts the company:


    • Supply chain. The DGA provides useful insight that improves our manufacturing by predicting potential disruptions to supply chains around the world.
    • Technical support. Our services teams embed critical data into customer solutions as part of services engagements and for preventative maintenance on hardware.
    • Financial services. DGA provides critical analytics for modeling, assessing credit risk and detecting fraud. Their models are closely tied to forecasts and bank rates, so statistical analysis is part of what they do day in and day out.
    • Marketing. The BI utilized by marketing gives clarity to what customers and prospects are saying, tweeting, looking for and buying. We use analytics to personalize offers, attract prospects and keep existing customers. We study things like customer churn, cross-sell and upsell opportunities.
    • Risk management. Not all decisions are perfect, so we use DGA analytics to lower both the potential cost of the mistakes we know we’re going to make and the probability that those risks actually occur.


    In short, analytics is pervasive throughout Dell, DGA is a big part of our competitive advantage, and the migration affected our data analysts and users in all of those groups. More important, since analytics is ubiquitous at Dell and embedded in our systems and processes, many more people consume and rely on the analytic output. Any adverse change to that output could drastically impact the business.


    As a result, we had to take extra care to ensure the DGA team was fully on board with the process and make certain that their ability to deliver this business-critical data across the organization was not hindered. This step was challenging and time consuming, but it increased the chances of a successful migration and minimized potential business disruptions.


    Download the eBook, “SAS to Statistica: The Great Dell Migration – Part 1: People,” for more insights into embarking on your own analytics migration project.


  • Information Management

    Analytics Migration Series Part I: Why Dell Made the Jump to Statistica

    The term “analytics platform migration” can elicit the same reaction as “root canal” or “Can you take me to the airport?” It is—for most companies—necessary at some point, but not particularly pleasant.

    In the coming weeks, we’ll share our own Dell-on-Dell story of migrating from one analytics platform, SAS, to the platform we purchased, Statistica. We would like to offer insight into why we decided to make the change and to help organizations take on the arduous process and still maintain business continuity.


    Download the Dell-on-Dell Case Study, “SAS to Statistica: The Great Dell Migration – Part 1: People”

    Our story begins when Dell acquired the advanced analytics product Statistica (formerly StatSoft). Within weeks of the acquisition, we set out to end our use of SAS over a six-month timeframe and adopt Statistica as the core analytics platform companywide. To offer insight into the scope of our migration project, here are a few high-level results:

    • Approximately 300 users migrated
    • Substantial, bottom-line impact due to saved fees
    • 300+ projects across multiple business units migrated from SAS to Statistica
    • Migration project team consisting of 12 points of contact for users


    The first and most obvious question is why Dell would decide to migrate to Statistica in the first place. From our perspective, it wouldn’t be right to ask our customers to consider using Statistica if we weren’t willing to use it ourselves. When prospects ask whether Dell uses the product they are promoting, our sales teams need to be able to answer, “Yes, we drink our own champagne.”

    Other critical factors included:

    Enabling the analytics enterprise. We’ve all read about it in books, magazines and news articles — we need to do something about analytics and big data. Organizations that embed analytics within all parts of their business to make faster decisions and improve decision making, planning and forecasting have a distinct competitive advantage. Unfortunately, there is a skills shortage, so we need a software package that plays well with existing IT investments, is secure, and is sufficiently easy to use. The goal is to enable all users — experts and line of business users alike — to make the most of their data.

    Lowering the cost of licensing. Even before Dell acquired Statistica, customers and prospects had been telling us that they wanted analytics software that was less expensive and, more important, able to make analytics accessible to more people in their organization. But they didn’t know what a migration project would entail and, like most companies, they were concerned about the downside of an unsuccessful one.

    Bang for the buck. Dell, like most companies, derives significant value and competitive advantage from applying analytics to areas of the business like marketing, price optimization, forecasting, supply chain optimization, preventive maintenance and credit risk analysis. We believed we could get better value for less money with the full range of analytic muscle and ease of use we saw in Statistica.

    More analysis, less code-writing. SAS is powerful, but you have to hire people to write and maintain code and administer the complex system to get the most out of it. We, along with many of our customers, were having increasing trouble finding good replacements for the SAS-savvy people who were leaving or retiring, so the cost of keeping SAS was rising beyond our annual licensing fees.

    It’s the best way to improve the product. You know that the path to a better product leads from your front door to your loading dock. Given that Dell is making significant investments in Statistica, it made all the sense in the world to use it ourselves and see what our customers were experiencing.

    Download the case study, “SAS to Statistica: The Great Dell Migration – Part 1: People.” We think you’ll enjoy this rare look into how Dell weaves analytics into almost everything we do, and how much like most companies we are.

  • Information Management

    Dell, Microsoft and the “All Data” Needs of SMBs

    I’m often asked how the new age of big data will impact small- and mid-sized businesses (SMBs). Can they keep up? Can they stay relevant in the age of big data? The answer is a resounding yes. After all, big data really represents all data, regardless of what it looks like or where it resides. We’re talking social media, internet, IoT, images, and digital media, as well as all those forms of structured data we’re quick to forget about but that still dominate the data management landscape. In no uncertain terms, the SMB is just as interested – if not more interested – in taking advantage of this pool of data. Just like their enterprise counterparts, SMBs are actively searching for the value and business opportunity hidden within data.

    In many ways, the new data landscape is completely altering the old competitive landscape as it pertains to SMBs and enterprises. The growth of so-called big data hasn’t made SMBs less competitive or less innovative. In fact, it’s done just the opposite. Thanks to advancements in our ability to capture and analyze data, SMBs can now drive innovation in ways previously reserved for the enterprise. Managing “all data” gives the business, regardless of their size or budget, the ability to better understand their customers, their businesses, and their marketplaces. All of which means that in this new data ecosystem, SMBs are more competitive with bigger, richer enterprises than they’ve ever been in the past.

    Which brings us to Microsoft. Microsoft, like Dell, has long been known as a champion of the middle market, and, again like Dell, it is clearly focused on taking that commitment to the next level amid the changing data landscape. You can already see how customers’ need to corral big data is shaping the way Microsoft supports the SMB ecosystem. Microsoft has invested in creating and morphing many products to support the evolving needs of SMBs. A great example is the company’s aggressive investment in Azure. Other examples of Microsoft investing heavily to meet the changing needs of SMBs can be seen in the company’s Excel and SharePoint brands.

    To understand what this all means for Dell (spoiler: it’s a great thing for Dell), let’s look more closely at Azure. Azure opens up a cloud-based approach to big data storage, giving SMBs a pay-as-you-go model. The platform lets SMBs experience small footprints of the big data ecosystem without committing precious resources to these efforts. These smaller chunks of data are also more representative of what an SMB might need to store and archive. Beyond Azure storage, Azure ML allows customers to experiment with big data using machine learning, essentially setting up an analytic sandbox for the organization to explore and experiment with analytic capabilities in a meaningful and “by design” way. Here’s where Dell comes into the mix: Dell Statistica allows SMBs to easily leverage its predictive power in tandem with Azure ML’s compute power to build a best-in-breed advanced analytics solution.

    The combination of Azure and Statistica provides just one great example of Microsoft and Dell technology working together to benefit SMBs. The potential for SMBs to leverage Dell technologies to deliver value on top of their Microsoft investment is virtually limitless, and it’s one of the primary characteristics differentiating Dell in the information management space. Not only is our ability to help you get the most out of your Microsoft investments unique, but so is our willingness. For SMBs and enterprises alike, Dell is the platform-agnostic vendor. Our relationship with Microsoft is stronger than ever, but when we say all data, we mean it. So, whatever investments SMBs make and whatever path they travel down in order to get more out of their data, we can go down it with them. That’s what all data is all about.

    Next Steps

    See how data growth and new technologies are affecting the DBA ― read the eye-opening study today.



  • Information Management

    Statistica Newsletter Tipped You Off About Facebook, Didn’t It?

    You never know what you will learn from the helpful Statistica newsletter. (Subscribe for free.) Our recent June issue brought to our readers’ attention that legacy StatSoft’s social media properties have been changed up quite a bit. You won’t find us by the same name anywhere anymore!

    Well, that’s not entirely true, but readers did learn that our old Facebook page has now expanded to include all our fellow software teammates in Dell’s Information Management Group. So, now when you go visit our page, you will find our Statistica content mixed with that of Dell Boomi, SharePlex, and Toad. It’s like we suddenly discovered an extended family with whom we share much in common—primarily we share the successful end-to-end workflow of YOUR data. We think you should stop by and get to know these family members, too, and then “like” them the way you’ve always liked Statistica.

    Find all our new social media links in the June newsletter >

  • Information Management

    #ThinkChat IoT and AoT with @ShawnRog and @Tdav

    Integration, analytics and process are all part of the Internet of Things (IoT) value chain. Pulling it all together is much harder than it seems and will present a significant challenge to many companies looking to derive value from IoT initiatives. Successful companies will need a strategy built on a flexible infrastructure that manages data and analytics at the edge and embeds them into critical applications.

    In this week’s #ThinkChat segment, Tom and I discuss why the Analytics of Things is more important than the Internet of Things (IoT). Neither of us discounts the value of the infrastructure or the data, but in the end, actionable insights are what drive the ROI, and it’s impossible to get there without the analytics.

    #ThinkChat Conversation with Tom Davenport Part 4 of 7         


    To view other segments in the #ThinkChat series click here.       

  • Information Management

    What’s In A Name? Statistica Wasn’t Always Statistica

    The headliner in the latest Statistica e-newsletter was hard to miss, announcing the official release of Statistica 12.7. Thirty-one years in the making and our analytics platform just keeps getting better! There were no trumpets or parades, but that doesn't mean there is not some really cool stuff in there. I won’t go into details about 12.7 here —that's what the newsletter is for! (Yes, you can subscribe for free.)

    But this occasion reminds me that, although Statistica's parent company, StatSoft, was formed the same year as Dell, the product line was not always called Statistica. Rather, the company's first product release was delightfully named PsychoStat, designed mainly for natural and social scientists. It featured menu-driven libraries of flexible statistical procedures integrated with data management, and it could run on a mere 64K of memory! PsychoStat was followed by the successful introduction of a product called Statistical Supplement for Lotus 1-2-3, as well as another called StatFast that was designed for use on the newly introduced Apple Macintosh.

    By late 1987, StatSoft had expanded and integrated all its lines of software into one large statistical package called CSS (Complete Statistical System). CSS included prototypes of many of the unique input, output, and analysis control features that would later become trademarks of StatSoft’s software technology.

    As CSS (and MacSS for the Macintosh) became popular, StatSoft was already devoting all R&D resources to the development of a new, advanced line of statistics software: STATISTICA, which was to offer entirely new levels of functionality not available in any other data analysis software at the time.

    And yes, in case you were wondering, the trademarked product name was always presented in italicized capital letters, even in body copy.

    The first (DOS) version of STATISTICA was released in March 1991, followed by STATISTICA/Mac in January 1992. Finally, STATISTICA for Windows (aka STATISTICA 4.0) was pre-released in 1993, representing the crowning achievement of StatSoft’s R&D efforts. The platform’s graphics technology and numerical integration, the flawlessness of its user interface, and the capacity and speed of its computational modules all set new standards for numerical and graphical analysis software.

    The rest, as they say, is history. Subsequent releases of STATISTICA continued to address enterprise, connectivity, and scalability needs in the world economy, and the platform continued to set new performance, quality, capacity, and comprehensiveness standards for statistics, graphics, and analytic data management software. No wonder it eventually caught Dell’s attention.

    StatSoft had steadily worked the software up to version 12 before the Dell acquisition in March 2014, after which the Statistica name itself got a makeover: no more italicized caps. Now the latest release this year is Statistica 12.7. And next…? Stay tuned.


  • Direct2Dell

    Walking the Line Between Innovation and Icky: Big Data and Privacy

    These days there’s a lot of talk about big data and its effect on privacy. After all, we now work with vast amounts of data that weren’t practical, accessible or simple to leverage in the recent past. It used to be true that companies tapped into only 20 percent of their data resources, leaving the remaining 80 percent untouched because it was too costly and difficult to utilize fully.

    Not anymore. Today, innovative companies are striving to use all of their data (#AllData). Advances in data mining and big data analytics are enabling innovation at the speed of business. We can mash-up, manipulate and mine information to do great, new insightful things. We can take advantage of derived data, which leverages several points of data to create new data about just about everything, including buying patterns, consumer preferences, business directions—the list goes on and on.

    But before we get carried away with the endless possibilities, let’s remember a quote often attributed to Voltaire: “With great power comes great responsibility.” Companies that start down this path—and it’s a crowded one these days—must walk a fine line between innovation and icky.

    Most everyone appreciates when Amazon makes suggestions for additional purchases based on behavior data. In other scenarios, data that is derived can come as a complete surprise—such as when a retailer uses shopping basket analysis to determine that you have a cold or a baby on the way. When Amazon uses data about you, it feels innovative. When a retailer creates data about you, it feels downright icky.
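
    Shopping-basket analysis, mentioned above, boils down to counting which items show up together more often than independent purchases would predict. Here is a toy sketch in Python; the baskets and the simple "lift" score are invented for illustration, not a description of how any particular retailer does it.

```python
# Toy sketch of shopping-basket analysis: count item-pair co-occurrence
# across baskets and compute a simple "lift" score (observed pair
# frequency vs. what independent purchases would predict). All baskets
# are invented for illustration.
from collections import Counter
from itertools import combinations

baskets = [
    {"tissues", "soup", "tea"},
    {"tissues", "soup", "cough drops"},
    {"bread", "milk"},
    {"bread", "milk", "tea"},
    {"tissues", "cough drops"},
]

item_counts = Counter()
pair_counts = Counter()
for basket in baskets:
    item_counts.update(basket)
    pair_counts.update(combinations(sorted(basket), 2))

n = len(baskets)

def lift(a, b):
    """>1 means the pair co-occurs more often than independence predicts."""
    pair_freq = pair_counts[tuple(sorted((a, b)))] / n
    return pair_freq / ((item_counts[a] / n) * (item_counts[b] / n))

# In this toy data, tissues + cough drops shows high lift: the kind of
# derived signal ("this shopper may have a cold") the text describes.
```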

    With the advent of big data technologies, there’s more and more data included in analytic work streams that simply wasn’t available before. Issues around privacy are very fluid right now. Common sense and best practices should prevail during conversations about where the boundaries of innovation and “ickiness” cross.

    According to a recent AP story on high-tech fitting rooms, some retailers are testing versions of technically advanced fitting rooms with “smart mirrors” to compete with online retailers. The technology enables a brick-and-mortar retailer to collect much of the same behavior data as online retailers and then use it to recommend other products. So, would you appreciate a mirror that suggests a pair of jeans to go with the red shirt you just tried on or is that an invasion of privacy?

    Consumer advocates already are voicing concerns about who ultimately has control over the data collected. Governments worldwide are starting to pass legislation and guidelines around digital privacy. It’s early, but the conversations need to continue so regulations can be developed to protect people from what they don’t fully understand.

    Recently, I bought new doorknobs for my kitchen cabinets and for weeks, I was inundated with online ads for doorknobs, even if I was visiting a sports, cooking or news website. Most people don’t know that major websites share data as part of behavioral ad targeting. I personally think it’s cool when Amazon suggests a book I might like based on a previous purchase. But, a sports site trying to sell me more doorknobs falls into the icky camp.

    That’s why it’s so important to understand both the context and circumstance of how data will be used. I spoke with an educator from a small school district on the east coast where analytics were being gathered on K-6 students. The goal was to mine all available information on a student to identify the key performance indicators (KPIs) that would correlate to how likely he or she was to graduate from high school. A fantastic utilization of data, isn’t it? But there also is a potential downside. How do you share it? Or should you share it? Do the parents deserve to know? Will the knowledge affect how teachers interact with students?
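
    The district's actual methods weren't described, but "identifying the KPIs that correlate with graduation" can be sketched in a few lines: rank candidate indicators by how strongly they correlate with a historical outcome. Every name and number below is invented for illustration.

```python
# Toy sketch: rank candidate student indicators by how strongly they
# correlate with a (historical) graduation outcome. Every name and
# number here is invented; real student analytics involve far more
# data, care, and privacy safeguards.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical historical records: per-student indicators plus a 0/1
# "graduated" outcome.
records = {
    "attendance_rate":  [0.95, 0.80, 0.99, 0.60, 0.90, 0.70],
    "reading_score":    [82,   71,   90,   55,   85,   68],
    "bus_ride_minutes": [15,   45,   10,   30,   20,   40],
}
graduated = [1, 0, 1, 0, 1, 0]

# Rank indicators by absolute correlation with the outcome; the top of
# this list is what the text calls a KPI candidate.
ranked = sorted(
    records,
    key=lambda name: abs(pearson(records[name], graduated)),
    reverse=True,
)
```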

    According to an article in Time, a movement is stirring in about 125 schools around the country. Officials are sifting through years of grades from thousands of former students to predict what will happen to current classmates. One university uses data to determine which students would have a higher propensity of graduating while other schools have learned to minimize costs of recruiting new students who they believe are more at risk.

    While big data and analytics are incredible, there is a double-edged sword around proper use of this information. For example, there’s a teachers’ union that is working with its state to change the compensation policy to one that is more performance-based. While that would seem all well and good, what if a school gathers data that reveals which students will not do well and then teachers don’t want these students because they could negatively impact their compensation? Or what if students find out their predicted fate and it turns into a self-fulfilling prophecy?

    At Dell, we understand innovation comes with responsibility. We strive to keep our finger on the pulse of governance and privacy best practices so we don’t cross the boundaries from innovation to icky. Have any thoughts on walking this fine line? If so, drop me a line at

  • Information Management

    Technology and Healthcare - Advances in Patient Care

    What do the 43rd President of the United States and I have in common? We like hanging out with healthcare technology professionals! President George W. Bush gave the closing keynote at this year’s Health Information and Management Systems Society (HIMSS) event in Chicago, and Dell was one of the corporate sponsors. It’s been a few years since my last visit to HIMSS and I have to say I was extremely impressed. The event is attended by over 42,000 people and seems to cover every square foot of McCormick Place. Dell had an extremely strong presence at the program, hosting a charity in the booth, a dozen different Dell HCLS solution demos and three live tweetups.

    The technology themes were varied throughout the event, and I was there to help lead a discussion on Population Health with Dell experts Dr. Gary Miner, Dr. Tom Hill and Dell's acting Chief Medical Officer, Dr. Charlotte Hovet. We were also joined by Dr. Ken Yale, Vice President of Clinical Solutions, ActiveHealth Management. It’s interesting to see where data is playing a role in driving more consistent and higher quality patient care. Population health obviously benefits from data-driven insights. Technologies like advanced analytics are helping us move beyond an understanding of large populations to focus on more personalized patient care via diverse data and insightful analytics. As we leverage more data and a greater variety of information on specific patients, the ability to personalize care and apply a customized level of best practices will result in much better overall patient care.

    L-R Shawn Rogers, Dr. Gary Miner, Dr. Tom Hill, Dr. Charlotte Hovet and Dr. Ken Yale

    The end result, as advanced analytics drives patient care forward, will be precision healthcare, where caregivers are able to execute specific regimens for each individual based on his or her specific needs, chemistry, DNA and other personalized markers. It's exciting to think that advanced analytics has the ability to enhance treatment and deliver personalized healthcare. Innovation isn't without its hurdles: connectivity to data, and a new responsibility for patients to bring their own data into play, will prove difficult. New trends will include device information on a patient’s exercise and activities, diet, location, travel history and more. Advanced analytic platforms will factor many new data points into models in order to achieve the highly specific care plans required by precision medical treatments. Look for caregivers to push back a bit as the culture of human knowledge and instinct collides with automated, model-driven best practices. I believe that ultimately both voices need to be heard in order to supply the best possible care. Dr. Hovet made the point that even though analytic platforms will supply a path for treatment, doctors will still play a critical role in communicating, implementing and executing precision medical treatment. The days of the robot doctor are still far in our future.

    Having been in the data business for as long as I have, I found the themes at HIMSS to be exciting and full of promise for future and immediate innovations based on our ability to leverage greater amounts of data and a wider, more dynamic range of information in order to add value to overall patient care. These are exciting, data-driven days for healthcare.

  • Information Management

    Analytics Here, There, And Underwear

    Perhaps some of you remember this successful advertising campaign of a bygone era: "The Maidenform Woman: You never know where she'll turn up." I was reminded of this while compiling the June issue of the Statistica e-newsletter.

    Those of you in the know are surely raising your eyebrows by now. That's because Maidenform was promoting women's undergarments, not analytics solutions. Nonetheless, the shared concept of ubiquity is where I found the Statistica connection.

    Specifically, I was prepping a lengthy list of events in EMEA and North America, everything from tradeshows and conferences to workshops and webcasts. But it was the EMEA events, both big and small, that struck me with their variety of non-analytic-sounding titles reflecting different industry verticals: HIMSS, Interop, Oil & Gas, Cloud World, IsisTech & Oxford AHSN eHealth. Of course, it helps to know what the acronyms stand for, but here's my point: Statistica's analytics solutions and data tech compatibility are applicable in just about every industry across the spectrum, so almost every business event out there is relevant.

    That is why Dell Software is sponsoring these events and many more. It is important that we meet decision makers where they are and help them envision the suitability of our solutions, even at functions that—at first glance—might not seem to be practical venues for exposure. Our calendar of events is certainly in keeping with our current mission to embed analytics everywhere, empower more people, and innovate faster. And as we continue to increase our reach through such varied opportunities, Statistica becomes like the Maidenform Woman: you never know where we'll turn up!

    Read (and subscribe to) the June newsletter >

  • Information Management

    Become an Analytics Superhero with Statistica 12.7

    Everyone loves superheroes. As if we needed proof, Avengers: Age of Ultron has already raked in more than $1.1 billion. Perhaps it’s because we all love to fantasize about having superpowers. Who wouldn’t want super strength like Captain America? How about the ability to fly, so you could whip past traffic on your morning commute?


    Our daydreams could go on and on, but we suspect there’s one superpower that would help you right now: the ability to quickly and easily analyze the massive volumes of data your organization collects each day. Imagine if you had Hulk-like strength to smash down data barriers. You could effortlessly collect, integrate, analyze and use all that data. But without the right powers ― and with constant demands to help generate business insights ― your data battles may seem more daunting than a faceoff with Ultron himself.


    Well, that whole fantasy about ruling your data universe is about to become a reality. That’s because we’re delivering the power you need to easily turn data into actionable information. And we’re talking about virtually all data here: data coming from traditional on-premises sources as well as cloud-based data sources and modern data stores like Hadoop and NoSQL. So if you’re ready to get your analytic superpowers on, check out our latest version of Statistica, the advanced analytics platform.


    Two important enhancements in Statistica 12.7 will empower you to take data analytics to the next level. We partnered with Datawatch Corporation to boost the advanced analytic capabilities of Statistica with enhanced interactive visualization and dashboarding. Rich, visual representations of various data streams will help you easily identify opportunities and hidden patterns. This release also offers self-service data preparation and real-time streaming to put data analysis power in business users’ hands.


    In his recent article, “Dell Brings Advanced Visualization to Analytics Platform,” CIO’s Thor Olavsrud noted that the addition of these advanced interactive visualization tools and dashboard capabilities extends the applicability of Statistica to additional users, including business analysts.


    Statistica 12.7 also integrates with our Toad and Boomi product lines, delivering connectivity with more than 164 data sources, cloud or on-premises, in motion or at rest. Further development of the Statistica big data analytics module, enhanced text-mining capabilities, and natural language processing and search tools expands the product’s ability to derive insights from unstructured data. The coolest part is that the Statistica big data analytics module brings advanced analytics to the data, rather than the data to the math.


    We already have customers using the newly integrated Datawatch capabilities. Don your own analytical superpowers with a free trial of Statistica 12.7 today.


  • Information Management

    #ThinkChat Big Data Use Cases and Pitfalls with @ShawnRog and @Tdav

    As quants become more critical to your overall analytic environment, there will be growing pains between them and the line-of-business executives they serve. Many business sponsors see quants as alien beings, math magicians from another planet. Quants can and do deliver extremely useful insights, but it remains the responsibility of the business leader to transition that insight into valuable action. Feeding the need of an executive to “look” informed is a recipe for disaster. It’s critical for a business leader to stay focused on action, not just the collection of information.

    In this week’s #ThinkChat segment Tom Davenport and I discuss the dynamics of quants and their business partners and touch on a few companies who are doing it right and getting high value from their investments in Big Data and Quants.

    #ThinkChat Conversation with Tom Davenport Part 2 of 7

    To view other segments in the #ThinkChat series click here


  • Direct2Dell

    The Internet of Things – Building Automation Analytics Solution in Action at DAAC

    At this year’s Dell Annual Analyst Conference (DAAC), Michael Dell was crystal clear about the immense opportunity the Internet of Things (IoT) represents. In fact, he called it the trillion dollar opportunity.

    Not surprisingly, IoT was one of the big trending topics also discussed by analysts, sourcing advisors, Dell partners and customers alike. It was reflected in keynote speeches, breakout sessions and in demos. The building automation demo by Dell OEM partner KMC Controls showcased how data taken from sensors in a building, aggregated via a gateway and then analyzed can help make a building more energy efficient and safer. Next to this demo, Dell Software showed how an advanced analytics solution built on top of the KMC Controls solution can provide comprehensive data integration and predictive analytics-based insights.

    The Internet of Things

    Let’s step back a bit. IoT is not new and although it is often framed as an emerging trend, it is no longer a prospect of the future. Many companies already have sensors on their equipment used for predictive maintenance.  The Internet of Things can be described as an ecosystem where sensors, devices and equipment are connected to a network, and can transmit and receive data for tracking, analyzing, and taking business actions. What does this data journey look like?

    The data journey

    The journey of the data begins at the sensor connected to a device, for example an air conditioning unit or refrigerator. Now the data has to travel via a wireless or wired medium to get to an aggregation point, either a gateway or a datacenter. Gateways, such as the just-launched Dell IoT Gateway, are small, wireless or wired devices that collect, help secure and process sensor data at the edge of a network. They represent one way of collecting data and can be used as a smart device to perform real-time monitoring and analytics of streaming data.

    Next, the traveling sensor data has to be integrated into a much larger pool of data, including non-sensor data such as weather, social media, CRM, business or other machine-to-machine (M2M) data. A platform like Dell Boomi allows for seamless, real-time data integration and normalization on the gateway, in the datacenter or in the cloud. At this level, Boomi also adds value by helping ensure industry data compliance (HIPAA, Safe Harbor, PCI, SOC II).

    The data value resides in analytics

    Nicely integrated, this data is now ready for analytics (Dell Statistica).  This video explains the value proposition.

    In short, here is how it works. There are two ways to analyze the data: (1) perform real-time analytics on streaming data (time series; see figures 1 and 2 below) at the edge (gateway), and (2) perform deeper analytics on the historical data set in the datacenter to help predict maintenance events or forecast business trends.

    An example of streaming data analytics is shown in figures 1 and 2. Figure 1 (follow link) shows the monitoring of refrigerator compressor temperature against ambient temperature in real time, setting off alerts to the building operator in case of anomalies, such as an unhealthy compressor.

    Figure 2 (also shown above) shows real-time operational analytics mashing coffee pricing data (milk, paper cup, coffee grounds, etc.) with sensor data from inside the manufacturing facility to monitor the overall health of the manufacturing facility.
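    The streaming checks in figures 1 and 2 boil down to comparing each live reading against a baseline and raising an alert on anomalies. Here is a minimal Python sketch of that idea; the function name, threshold and readings are invented for illustration and are not part of the Statistica or Dell IoT Gateway APIs:

    ```python
    # Minimal edge-side streaming check: flag a compressor whose temperature
    # drifts too far above ambient (threshold and data are hypothetical).

    def check_reading(compressor_temp, ambient_temp, max_delta=15.0):
        """Return an alert string when the compressor runs hot, else None."""
        delta = compressor_temp - ambient_temp
        if delta > max_delta:
            return f"ALERT: compressor {delta:.1f} degrees above ambient"
        return None

    # Simulated stream of (compressor, ambient) temperature readings
    stream = [(18.0, 21.0), (25.0, 21.5), (40.0, 22.0)]

    alerts = []
    for comp, amb in stream:
        msg = check_reading(comp, amb)
        if msg:
            alerts.append(msg)  # in practice, notify the building operator
    ```

    A real gateway deployment would apply a model rather than a fixed threshold, but the shape of the loop, scoring each reading as it arrives, is the same.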

    Figure 3 (follow link) provides an example of predictive analytics performed on data at rest. This example shows how we can use multivariate process monitoring and control statistics to predict machine health. The analytics show that there is a definite correlation between the speed of the cup movement by the robotic arm inside the coffee machine and the ice-dispensing speed, which points to the need for maintenance of the robotic arm.
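    The figure 3 analysis rests on measuring how strongly two sensor series move together. A minimal sketch of that computation in plain Python; the data and the 0.9 maintenance threshold are invented for illustration, not taken from the demo:

    ```python
    # Pearson correlation between two historical sensor series: robotic-arm
    # cup-movement speed and ice-dispensing speed (hypothetical readings).
    import math

    def pearson(xs, ys):
        """Pearson correlation coefficient of two equal-length series."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    cup_speed = [1.00, 0.98, 0.95, 0.91, 0.88, 0.84]   # arm slowing over time
    ice_speed = [2.00, 1.97, 1.90, 1.84, 1.77, 1.70]   # dispensing slowing too

    r = pearson(cup_speed, ice_speed)
    needs_maintenance = abs(r) > 0.9  # strong correlation triggers a work order
    ```

    Multivariate process control as used in the demo goes beyond a single pairwise correlation, but a simple coefficient like this is the building block.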


    KMC Controls, via its building automation solution, controls and monitors the devices and sensors that are installed in a building. There is an opportunity to develop a centralized view of the entire building to provide a comprehensive IT management solution which includes IoT devices and sensors as well as all IT assets that are managed inside the building.


    Many devices work on their own secure network, and building automation manufacturers like KMC Controls need to consider what security measures must be implemented for detecting and blocking malicious activity over non-standard network protocols. A natural starting point is to consider extending the existing firewall technologies to comprehend these new devices.

    Learn more about IoT solutions

    To assist companies in overcoming the hurdles in setting up industrial IoT technologies, Dell has set up the Dell Internet of Things (IoT) Lab. Companies can come to the lab, located in the Dell Silicon Valley Solution Center in Santa Clara, CA, to receive assistance in architecting solutions to their IoT needs.

    To learn more about the Dell IoT Lab, please see

    To learn more about the software solutions Dell offers in this space, please visit

  • Direct2Dell

    The Future of Analytics in Healthcare: Part Two

    Q&A with Dr. Charlotte Hovet, Dell’s Medical Director of Global Healthcare Solutions

    In this second part of our interview series with Dr. Charlotte Hovet, medical director of Global Healthcare Solutions at Dell, we examine what healthcare will look like in 2020 and offer tips for getting started.

    What will healthcare look like in 2020?

    The world of healthcare will look different in five years and significantly different in 10 years as providers and patients adapt to disruptive change.  Technology, consumerism and new payment models are reshaping the delivery of healthcare, and as a result, we can anticipate better care, better health and lower costs. However, that doesn’t mean there won’t be significant stumbling blocks along the way.

    I’ve traversed the globe for nearly a decade advocating change and while physicians are quick to adopt new medical devices, they’ve been slow to embrace Electronic Health Records (EHR). And, often for good reasons. New technologies must enhance the effectiveness and efficiency of clinical practice and align with people, process and policy changes. As such, challenges will linger over the next five years as phase two of the EHR continues and progress is made on the integration, interoperability and security fronts.

    Full adoption of patient-centered tools will take time and patience as well as assistance to overcome steep learning curves. Moving to a digital world is certainly disruptive but the upside is tremendous—a world of true clinical collaboration and innovation. The ability to deliver integrated services will spawn new care-giving models with expanded scopes and teams outside the traditional clinic and hospital settings.

    New technologies like telemedicine will emerge that enable people to have care on a daily basis where they need it most—in their homes. Care teams will be able to reach out to patients in remote areas with a focus on prevention and continuity of care. Whether a person has an acute problem or a chronic disease, the capacity for home care will be greatly enhanced. It will be an exciting time in healthcare as we experience value-based, rather than volume-based, service delivery.

    Would you say that value-based healthcare is data-driven healthcare?

    Absolutely! Analytics and informatics will be the primary drivers in this newly expanded healthcare view. The knowledge we derive from data changes everything—how we interact with patients and how we diagnose and treat them. Let’s look at University of Iowa Hospitals and Clinics where Dr. John Cromwell is using Dell Statistica to better predict which patients face surgery risks and then determine which medications or wound treatments would be most effective in reducing their chances of acquiring a hospital-acquired infection.

    How will predictive analytics be useful in lowering healthcare costs?

    Instead of focusing most of our efforts on the high-cost, high-risk group, which currently accounts for three-quarters of our healthcare spending, predictive analytics will enable organizations to focus on the rising risk—the middle group—which is often ignored. If we can identify those people who are at risk for chronic disease and actively intervene before they become high risk, we can make major headway in lowering the cost of healthcare delivery while dramatically improving quality.

    What’s the best way to get started?

    Healthcare transformation requires alignment of people, processes and technology, which we discussed in a recent webinar. We recommend starting with a readiness assessment to reveal where you do—and don’t—have alignment across the organization.

    This can be determined by asking basic questions, such as: What clinical analytics do you need and who holds the key to that information? Who on your staff will mine the data and look for trends? What will be done with that information to change the delivery of healthcare services? What role does governance play in all of this? And, what steps need to be taken once all this knowledge is passed along to the appropriate clinical improvement teams? How will they collaborate to identify trends, turn insights into action and change care delivery?

    This iterative process needs to involve the physicians and nurses who are directly involved in delivering care. Future healthcare will be highly collaborative and empower healthcare professionals to identify best practices through analytics as well as how they can use this information to improve decision making and patient care outcomes.

    How is Dell helping customers accelerate healthcare transformations?

    Dell does a great job of guiding customers along their data-driven journeys by bringing together hardware, software and services to address their needs today while providing an IT platform for the future. Today, we’re proud to be working with some of the leading healthcare organizations, including Dignity Health, HealthMarkets, Beth Israel, Boston Medical Center and more.

    In my travels, I meet with chief analytics officers, chief technology officers and chief medical information officers. I tell them about a population analytics project with a hospital in the south. I share highlights of a recent pilot using advanced predictive analytics to identify those at risk for exacerbation of asthma or diabetes and the impact on hospital readmissions. I explain how the University of Iowa Hospitals and Clinics has lowered surgery infection risks and subsequently surgery costs.

    It’s exciting to share insights, ideas and engage others in healthcare transformation today, so we all can benefit from a new world of healthcare delivery by 2020.


    What do you think healthcare will look like by 2020? Email me at to offer your forecast.


  • Statistica

    Thought Leaders in the Mix - June 2015

    Our subject matter experts for Statistica and the Information Management Group (IMG) keep busy, staying abreast of current trends with big and small data, predictive software and real-world analytics solutions. And they frequently comment in industry publications, professional forums and blogs. Here are a few of their recent articles.
    Analytics in Healthcare: Q&A with Dr. Charlotte Hovet, Part 1
    by John Whittaker, executive director of product marketing

    In this interview, Dr. Charlotte Hovet, medical director of Dell’s Global Healthcare Solutions, shares her thoughts about how healthcare informatics and predictive analytics are helping to usher in a new era of wellness and disease prevention.



    Connected Cows and the Evolution of Agriculture IoT
    by Shree Dandekar, executive director of product management (analytics)

    The benefits of agriculture IoT sound enticing, but there are architectural challenges to be addressed before deciding on a solution that turns 6000 head of cattle into a data powerhouse. Dandekar describes Dell's successful case at Chitale Dairy.



    Why companies can't afford to go overboard with analytics
    by Joanna Schloss, business intelligence and analytics evangelist

    While advanced analytics is a critical component to the success of an organization, Schloss outlines in her latest CMSWire article the drawbacks of excessive analysis and the benefits of focusing on innovation rather than optimization.




  • Direct2Dell

    Getting Smarter About Collective Intelligence

    Lately, whenever I hold a whiteboard discussion on collective intelligence, customers, prospects, analysts and even my fellow Dell team members give me their full attention. Now, people have talked about collective intelligence for ages, but I think what drives the point home now more than ever before is the success we’re seeing with crowdfunding and crowdsourcing.


    There’s an important lesson from crowdsourcing that data scientists need to learn: there’s greater value in your information if you can share it more readily with more people. The collective intelligence you can gather will be far richer than if it had stayed within the confines of the corporate walls.

    Let’s face it, intelligence is not evenly distributed in this world. But there are lots of folks who are very good at building models and want to make them available for the greater good. By tapping into this shared, group intelligence, companies of all kinds can make better business decisions. Perhaps that’s why I get similar levels of excitement whether I’m talking to a Silicon Valley start-up, healthcare organization, energy company or high-tech manufacturer.

    After all, why rely on four or five data scientists in your own organization when you can turn to data scientists around the world for insight and perspective? This is the winning approach taken by Apervita, a leading health analytics community and soon-to-be Dell partner, which empowers health professionals and enterprises to capture and share health knowledge. They’re smart about facilitating collective intelligence by simplifying how people author, publish and use health analytics, including algorithms, quality and safety measures, pathways and protocols.

    In February, the Mayo Clinic joined the Apervita community to share its extensive portfolio of algorithms covering specialties, such as cardiovascular, pulmonology and oncology. The goal: To make it easy for physicians to sift through all the Mayo Clinic’s cardiovascular data, for instance, so they can automatically identify patients at risk for sudden cardiac arrest, which is the leading cause of death among adults over the age of 40.

    In May, the Cleveland Clinic joined the Apervita community to share its advanced prediction models and wealth of medical knowledge with a broader audience. By liberating all this data and putting the collective knowledge to work, these organizations and Apervita are making it much easier for health researchers and practitioners worldwide to have a positive global impact on health.

    The beauty of Apervita’s cloud-based approach is in the simplicity and openness of its platform, which enables anyone, anywhere to create and subscribe to analytics and then easily integrate them into their workflows. This is the same approach taken by Algorithmia, which launched in 2013 with the goal of advancing the art of algorithm development, discovery and use.

    Dublin, Ireland-based ExpertModels is another innovative group with an open, online platform for sharing data insights as well as building, requesting or marketing data sets, analytical models and data science expertise. The openness of these data markets and communities is what makes them an ideal conduit for collective intelligence. That’s also what truly differentiates Dell Statistica because the sheer openness of its architecture makes it possible to blend the best of these models through a common repository.

    By opening our platform to other environments, Dell has empowered organizations to take models from Algorithmia, Azure ML, ExpertModels, etc., and knit them together in new workflows to increase interaction and collaboration. It’s a great example of how we’re helping customers get smarter about collective intelligence—and we’re the only company that can deliver this level of integration.

    It’s one of the reasons Borden Chemical initially chose Statistica as an analysis platform at over 30 sites worldwide. A leading supplier of high-performance resins, adhesives, coatings and basic chemicals to a broad range of industries and thousands of end-use applications, Borden integrated Statistica with its SAP and Laboratory Information Management System (LIMS). By taking advantage of Statistica’s open, distributed architecture, the company easily empowered more than 150 researchers, quality control engineers and technical consultants to combine their collective intelligence worldwide to simplify complex data analyses and reporting.

    I’m confident this collective intelligence message will continue to resonate, especially as we share more examples of all the amazing things we can accomplish with distributed intelligence. After all, we’ve always known that “two heads are better than one,” so think of what can be done when you amplify that with hundreds of thousands of smart people and interactive models.

    How can you increase your company’s collective intelligence? Drop me a line at with ideas on how to get smarter about collective intelligence.

  • Information Management

    You Now Have More Access To Advanced Analytics, But At What Cost?

    Big data has made big strides in recent years. Specifically, more organizations than ever before are leveraging data and information to deliver actionable and valuable business insights. While big data problems of the past have centered on making sure infrastructure could keep up with how much data is being pulled, significant advancements in storage and other infrastructure technologies have given us a firm foundation on which companies can deploy their predictive models.

    Thus far, 2015 has provided new opportunities to bring analytics directly to business users, but with it, challenges now go beyond what’s in the datacenter alone. These opportunities and challenges have already begun to present themselves and organizations are learning to address them in the following ways:

    Opportunity: Enterprises are using existing technology with big data platforms to deliver ROI

    While the analytics space has historically been crowded with BI, dashboarding and other tools, more enterprises have begun to use new platforms with existing analytics programs to unlock business value. To begin with, enterprises are looking for ways to incorporate data visualization with data analytics solutions to more easily interpret vast amounts of unstructured data. While interpretation challenges remain, those who apply visualization solutions map out meaningful insights that everyone from non-technical executives to data scientists can read more effortlessly.

    One of the ROI-achieving byproducts of visualization and analytics is that insights now become more accessible to a wider user base. With BI vendor offerings becoming increasingly easy to operate, business-minded users who might not have the background to use traditional systems are finally able to leverage data analytics to create new revenue streams. In doing so, they’re able to deliver better customer experiences and expand into new markets.

    Challenge: Self-service and automated decision-making are influencing businesses to reorganize 

    While the demand for candidates skilled at interpreting data still surpasses the supply, companies are coping with this shortage by investing in self-service, automation and augmentation platforms. Ultimately, organizations are leveraging automated decision-making and data discovery tools for improved cost and efficiency. At the same time, they must be prepared to significantly restructure to achieve competitive advantages like using data to proactively cross-sell and up-sell. Many operational processes now can be completely executed automatically with data analytics. When adopting programs that automatically push successful predictive models straight to the data, organizations should spend time checking the source to ensure the usefulness and relevance of tried-and-true models. While automation can enable real-time analytics, resources still should be allocated toward making sure current models are the best ones to use.  

    Opportunity: The growth of Information as a Service (IaaS) is providing easier access to analytics

    There is a steady movement from simple, backward-looking descriptive analytics to advanced analytics that predict outcomes and prescribe a course of action. This creates more opportunities to democratize access to analytics. One option emerging as a result of both this movement and the rising popularity of “as-a-service” delivery models is “information as a service,” or IaaS. The availability of IaaS further lowers the barrier to entry for businesses that historically have not had the technology, finances or skills to leverage advanced analytics, and provides them with an additional competitive edge to bolster growth.

    Challenge: You’ll find network and security challenges at the intersection of big data and IoT

    The growth of connected endpoints is making more information available for extracting insight. This, in and of itself, has driven both the growth of IoT and the need for analytics. With more data, however, comes greater exposure to cybersecurity threats, compliance issues and other risks. Although this creates a market opportunity for vendors offering integrated solutions that span data analytics, endpoint management, threat detection and compliance, the reality for IoT organizations is that they already struggle not only to mine new data pools, but also to store that data securely.

    Organizations continue to have an opportunity to benefit from advanced analytics, as access to data only gets easier and making sense of it grows less complex. This creates opportunity for people outside the data scientist profile – from business users who need analytics to solve a marketing problem, to small businesses that, until recently, couldn’t afford the time or cost of pulling insights from their data. While other challenges have emerged and will continue to do so, improved accessibility has opened a huge window of opportunity that helps businesses use their data to sharpen their competitive advantage.

  • Information Management

    Shire Achieves Success

    With the headline, "Get inspired by the life-changing benefits of Dell Statistica," we linked to a new case study in the April/May issue of the Statistica Monthly News. This one is all about the rare diseases unit of Shire, a global biopharmaceutical company that seeks to ensure the robust and uninterrupted supply of quality medication to its patients. The implementation of Dell Statistica has helped them conduct statistical process control, monitor processes and identify areas for improvement.

    Anything that reduces a multi-day analysis down to mere minutes without sacrificing quality has got to be good, right?

    Given the nature of Shire’s business and the kind of help they offer to people all over the world, this is a very inspiring story, and Dell Statistica is proud to be part of it.

    Read more in the April/May issue of Statistica Monthly News >  

  • Information Management

    The New Statistica: Datawatch Visualization, Hadoop Connectivity and Access to Even More Data

    How do you empower more people with advanced analytics for greater impact on how they do their job?  How do you embed analytics everywhere so you can make data-driven decisions?  How can you use analytics to innovate faster?

    These questions keep us up at night, just as they keep a lot of our customers up at night. We don’t have a silver bullet for them yet, but we’re moving Statistica closer to being one, little by little.

    Favorable analyst reports, new release of Statistica

    The press and analysts are starting to notice. They’re talking to you and finding out that Statistica is meeting your needs and then some. Many Statistica customers are commenting favorably on the ease of use, completeness of solution, integration efforts with other Dell products, and ongoing investment by Dell in the product and people since the acquisition last year.

    That’s encouraging news as we get ready to launch this quarter’s Statistica version 12.7, with features aimed at helping you get advanced analytics into the hands of more of your users:

    • So the 16,000 analytical options in Statistica weren’t enough for you? We’re not surprised. That’s why a few weeks ago we began taking the product in a new direction through a technology partnership with Datawatch. In the next version of Statistica, your users will have access to rich, interactive visualizations and dashboards. These visualizations will allow you to drill down and see patterns and opportunities in your data. What’s really cool is that you’ll be able to visualize real-time data streams in Statistica and share the results with even more people in your organization.
    • Big data is everywhere, so we’re doubling down on our big data analytics module — we used to call it Kitenga — with enhanced connectivity to Hadoop HDFS. We’re going beyond simply connecting to Hadoop: instead of bringing the data to the math, the Big Data Analytics module allows you to bring the analytics to the data. That will eliminate performance bottlenecks and handle ever-increasing amounts of data more smoothly. Furthermore, the module also enhances our text analytics capability with natural language processing (NLP) and entity extraction.
    • Statistica is settling in just fine with other Dell products. So much so that this release includes connectivity to both unstructured and structured data sources in Toad Data Point, Toad Intelligence Central and Boomi. That means Statistica users can aggregate and analyze data from more than 160 sources without the need for constant IT support. With these new integrations, your data can be in the cloud or on premises, moving or at rest.
    • For those of you digging deeper into R, Statistica continues its long-standing support and integration with the language. Statistica’s governance and control will let you deploy R models more effectively.
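    The Hadoop bullet above describes “bringing the analytics to the data” rather than bringing the data to the math. As a rough sense of what that means in practice, here is a minimal, generic sketch in the Hadoop Streaming map/reduce style, where computation runs near the data and only small aggregates leave the cluster. This is an illustration only, not the Statistica Big Data Analytics module or its API, and the sensor-reading data fields are invented for the example.

    ```python
    # Generic "analytics to the data" sketch: a Hadoop Streaming-style
    # mapper and reducer. In a real cluster, mappers run on the nodes
    # holding the data; here we simulate the pipeline in one process.
    from itertools import groupby


    def mapper(records):
        """Emit (sensor_id, reading) pairs from raw CSV lines (hypothetical format)."""
        for line in records:
            sensor_id, reading = line.strip().split(",")
            yield sensor_id, float(reading)


    def reducer(pairs):
        """Average readings per sensor; only these small aggregates would leave the cluster."""
        for sensor_id, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
            values = [v for _, v in group]
            yield sensor_id, sum(values) / len(values)


    if __name__ == "__main__":
        raw = ["s1,10.0", "s2,4.0", "s1,20.0"]
        print(dict(reducer(mapper(raw))))  # → {'s1': 15.0, 's2': 4.0}
    ```

    The point of the pattern is the shape of the traffic: raw records stay where they are stored, and only per-key summaries move, which is what removes the performance bottleneck the bullet describes.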

    We plan quarterly releases to Statistica from now on. If you already use Statistica, keep an eye out for upgrade instructions. And if you’re tired of staying up at night figuring out how to get analytics into the hands of more of your users, keep an eye out for the free trial version to download.

  • Information Management

    The “Internet of Things” Is Already Here

    What do your smartphone and your office air conditioning system have in common? Surprise—they top the list of IoT data sources.

    In the May issue of the Statistica Monthly News, we shared an item about a new infographic that summarizes the results of a survey Dell commissioned from Enterprise Management Associates (EMA). It’s all about the Internet of Things (IoT) marketplace.

    When it comes to the IoT, I tend to think of it as something that is still theoretical, probably because I didn’t wake up this morning to learn that Skynet was suddenly in charge of everything. But the build-up of the IoT is in full swing and has been for some time, driven by machine-to-machine (M2M) sensors and real-time automation. Perhaps it is not at critical mass yet, but EMA’s infographic identifies some of the major industries driving that momentum. Review the infographic and see what other interesting information EMA uncovered.

  • Direct2Dell

    Analytics in Healthcare: Q&A with Dr. Charlotte Hovet, Part 1

    In this two-part series, Dr. Charlotte Hovet, medical director of Global Healthcare Solutions at Dell, shares her thoughts about how healthcare informatics and predictive analytics are helping to usher in a new era of wellness and disease prevention. Part one takes a look at healthcare informatics changes happening today, while part two looks ahead to what we should expect by 2020.

    Q: How do you define healthcare informatics?

    Last month at HIMSS’15, I spent considerable time speaking with attendees and Dell customers about health informatics. The buzzword “informatics” is often seen as the intersection between computers and medicine. But that’s not it. Informatics is a science; it’s the study and practice of creating, storing, finding, manipulating and sharing information. Informatics drives innovation, and when applied to healthcare, it enables us to effectively use information and knowledge to improve the quality, safety and cost of patient care.

    I spent more than 20 years as a practicing family physician, and that clinical experience shapes how I view patient-centered, information-driven healthcare. Additionally, I’ve spent nearly a decade advising others on the transformation of healthcare and the role of information technology.

    Q: What role does predictive analytics play in accelerating this healthcare transformation?

    Analytics is becoming critical to healthcare. The wealth of digital data available to healthcare providers is expanding exponentially, and having the ability to share, integrate and analyze data opens the door to population health management, disease prediction and personalized medicine.

    Because medical records are now digitized, we have the ability to share data and use it to improve decision-making. Using data to inform clinical decisions is key to transforming healthcare. With analytics, we can connect disparate types of information to identify trends. From those trends, we gain valuable insights, and these insights should be used to change behavior. The only way we can significantly change patient-care outcomes is by changing the way care is delivered by clinicians and the way patients manage their health.

    Q: How does information-driven healthcare change the doctor-patient relationship?

    With information-driven healthcare, we can move from episodic care, where patients present with a particular problem, to new models of care delivery designed to optimize the health and wellbeing of the populations we serve.

    This will transform today’s fragmented, volume-based healthcare delivery model into a mobile, interconnected, value-based, collaborative care delivery model. The status quo is being disrupted by innovative digital technologies and the ability to align patient wellness with physician payment incentives. The result is the emergence of new tools and new healthcare capabilities that will lead to more personal and precise medicine.

    Q: What impact will new wearables technologies and smart sensors have on doctor-patient interactions?

    We’re in the early stages of wearable devices; however, it’s clear that wearables and other technologies enabled by the Internet of Things will play an important role in empowering patients to use information and take greater responsibility for their health. This happens through capabilities like accessing lab results from your smartphone or using telemedicine to expedite a diagnosis. All of this data—whether it’s directed to the patient or the provider—leads to more informed decisions.

    In the future, we’ll see greater opportunities for shared decision-making as both patients and their physicians will have direct access to information that can make a difference. Wearable sensors will do more than remind you to walk more. They’ll trigger automated functions that can be incorporated into your lifestyle to increase prevention and wellbeing.

    Wearable technology will be able to combine personal and social media data with sensor data to reveal useful insights into a person’s lifestyle, such as sleep patterns and exercise levels. All of this data will be fed into cloud-based solutions where a variety of predictive analytics can be performed to deliver more personalized medicine.

    Q: How is Dell driving the delivery of patient-centered, information-driven healthcare?

    For the past two decades, Dell has helped healthcare organizations keep pace with dramatic industry changes with end-to-end, future-ready IT platforms that drive healthcare innovation. Our leadership in the healthcare IT space was reinforced recently by Gartner, which ranked us the No. 1 IT Services Vendor in the Healthcare Providers Category for the sixth consecutive year.

    This is a great accomplishment, and no coincidence, as we’ve focused on hardware, software and services to help our healthcare customers optimize their clinical information systems and be better prepared for what lies ahead. With our cloud and analytics expertise, they’re embracing more patient-engaging strategies while moving forward with clinical analytics.

    In the next five years, the adoption curve will be very rapid for new healthcare IT technologies that provide sustainable benefits. Predictive analytics and informatics will still be at the epicenter, delivering the insight and knowledge required for healthcare transformation.

    We’ll continue the discussion in part two of our conversation with Dr. Hovet. Till then, drop me a line to share your thoughts on the future of analytics in healthcare.