Blog - Post List
  • Information Management

    The Timing Belt in our Great Analytics Migration [New E-book]

           Photo Credit: Ernesto Andrade | Licensed under CC BY 2.0

    Imagine that you’re in the middle of an analytics migration project (as Dell was).

    Hundreds of users and projects all over the world are in transition from your legacy analytics product to the new one (as ours were).

    Everyone is heads-down, focused on following vast, detailed project plans with a jillion moving parts (as we were).

    Suddenly, out of nowhere, an opportunity to fix Something Else swoops onto the scene (as it did onto ours).

    That Something Else is kind of a mess, and this would be the ideal time to fix it, but everybody around you is urging you to focus, focus, focus on migrating projects and users. The Something Else has to do with the tools and processes on either side of the analytics function. It’s not exactly the same issue as replacing your company’s advanced analytics product, but it’s closely related.

    What do you do: stay focused on your original project or devote some cycles to dealing with the Something Else?

    ETL, data extraction and reporting. And the timing belt.

    At Dell, we were waist-deep in migration from a well-known analytics product we had used for decades (you can probably guess which one) to Statistica, a product we had recently acquired. As I posted a few weeks ago, our migration project team discovered that a lot of people were using a Ferrari to haul dirt – that is, using a powerful analytics tool just for data manipulation – so we made some organization and tool changes as part of the migration.

    But the Something Else we discovered was that people were using dozens of tools for some of the main functions around analytics:

    ETL (Extract, Transform, Load) Process Automation – Microsoft SQL Server Integration Services, Microsoft Visual Studio

    Data Extraction – Microsoft SQL Server, D3’s JavaScript library, Adobe Site Catalyst

    Reporting – Microsoft SQL Server Reporting Services, Microsoft Access

    We had the opportunity to consolidate or replace these and stop the tool-creep, and it seemed as though we’d never have a better time to do it. Once everyone saw the inefficiency, the whole migration team wanted to deal with it, but it wasn’t part of the original plan.

    It was like the timing belt story:

    “Well, ma’am, your car has 90,000 miles, so we should replace the timing belt. And while we’re in there, we’ll have everything apart, so if there’s a problem with your water pump or the tensioner or the front cover gasket or the seal, that’s the best time to take care of it.”

    It’s tough to bite that bullet and deal with the Something Else. But you know if you don’t deal with it and you have to go back in again later to fix it, you’ll kick yourself.

    Actually, you won’t have to, because your boss will do the kicking for you.

    The Great Analytics Migration – new e-book

    So what did we do at Dell?

    We went the extra mile and did the consolidation. It’s the kind of company we are: we can’t look at an inefficiency and not do something about it. Our companywide Business Intelligence Council ran a survey that found dozens of tools at work. From the top ten of those additional tools, the council identified seven for migration to appropriate Dell technologies. We’ll get to the rest of them eventually.

    Should you migrate users from other tools in the same project? We can’t tell you how to make that decision for your company, but we’ve put together an e-book, “Statistica: The Great Analytics Migration, Part 3: Technology,” that tells you how we made it at Dell. Read the e-book for unique insights into how we managed our migration. We know quite a bit about it.

    Just don’t ask us about your timing belt.


    David Sweenor

    About David Sweenor

    From requirements to coding, reporting to analytics, I enjoy leading change and challenging the status quo. I have over 15 years of experience spanning the analytics spectrum, including semiconductor yield characterization, enterprise data warehousing, reporting/analytics, IT program management, product marketing and competitive intelligence. I currently lead Analytics Product Marketing for the Dell Software Group.

    View all posts by David Sweenor | Twitter

  • Information Management

    Hippocrates Would Have Liked PAW-Boston Chowder

    While I certainly appreciate Boston for its history, chowder, and marathon, it is the predictive analytics scene that keeps bringing us back year after year. I know that sounds odd, but the annual Predictive Analytics World (PAW) Boston event is a natural fit for Statistica, especially with the recent development of a predictive healthcare track. 


    Healthcare's connection to predictive analytics arguably extends back to the ancient Greek physician, Hippocrates of Kos, who supposedly provided the instruction, “Declare the past, diagnose the present, foretell the future.” And if that isn't data-scientist-speak, I don't know what is! Hippocrates also touted the medicinal value of food, so I have no doubt he would have prescribed Boston clam chowder for its palliative effects, though I suspect he had his fill of seafood in his time (several hundred years before the birth of Christ).

    Back then, of course, the healthcare system--if it could be called such--was comparatively simple, perhaps limited primarily to individual doctor-patient relationships. That simplicity is no longer the norm. During Statistica's mere 31-year legacy, our customers have driven us to develop expertise that guides healthcare organizations through the necessary components of data management and reporting, patient analytics, insurance risk reduction and regulatory compliance. You can learn about some of our healthcare successes in our datasheets, white papers, and videos.

    So, when it comes to targeted events like PAW-Healthcare in Boston, we get to be all over the place. Our newsletter readers (yes, you can subscribe for FREE) already received a short list of our PAW-Healthcare exposure, where we will be sharing our expertise face-to-face with modern-day physicians and data scientists at breakfasts, meetups, and presentations. Take a look here and then be sure to register for PAW-Healthcare yourself.

    We will also maintain a presence at booth #240, so we hope to see you there the week of September 28.

  • Information Management

    Onboarding Users and our Great Analytics Migration [New E-book]

    You can lead a horse to water, but you can’t make it drink.

    Image credit: Greg Westfall | Licensed under: CC BY 2.0

    If you’re going to spend months putting that water in place by migrating to a new analytics platform, you’d better build a process for onboarding users smoothly so that they drink. Otherwise, you’ll end up with a lot of people looking like the kid in the photo and reciting the caption to you.

    I mentioned in my previous post the migration project we underwent here at Dell to move off one of the world’s best-known legacy analytics products and onto Statistica, an analytics platform Dell had recently acquired. How do you onboard users throughout the project when you make a fundamental switch like that?

    Where does your user onboarding process live?

    Who manages user onboarding in your organization? Usually, the onboarding process lives in IT, which is where it used to reside at Dell. It wasn’t perfect, but we lived with it that way for a long time, along with three burdensome restrictions:

    Finite number of licenses: It’s hard to onboard new users when you have a limit on licenses for your analytics software. We had to ask IT for more licenses, and they had to tell us none were available.

    License swapping: De-activating and re-activating licenses to move them between data analysts was a drag, but as keeper of the software keys, IT had to be involved.

    Doling out licenses carefully: On the rare occasions when licenses were freed up, people had to lobby IT for access to them.

    In fact, it took the migration project to finally break this pattern, and that’s when we moved the process out of IT.

    The flexible licensing model of Statistica allowed us to change the focus of onboarding users from IT to self-service in the business units themselves. Then, we made an organizational change so that the Business Intelligence Center of Excellence in each business unit made and managed its own strategy for allocating access to Statistica.

    That pushed the unwelcome variable (IT’s response time) out of the migration project and kicked off more-efficient onboarding for everyone. Internal customer satisfaction went up when users saw that getting access would be easier in the future than it had been with the legacy product.

    The Great Analytics Migration – new e-book

    We’ve written a new e-book called “Statistica: The Great Analytics Migration, Part 3: Technology.” Read it for an idea of how we at Dell handled the migration from one of the world’s best-known analytics products (you can probably guess which one) onto Statistica while allaying users’ concerns about migration and access.

    If you embark on a migration project, whether for analytics or any other companywide function, you’ll need to think about making the user onboarding process palatable.

    After all, the kid in the photo is cute for a minute or two, but you don’t want your co-workers looking at you like that for months on end.

  • Information Management

    Data Manipulation and our Great Analytics Migration [New E-book]

    “Why would you use a Ferrari to haul a load of dirt?”

    Yeah. Why would you?

         Photo Credit:  Falk Lademann

    You wouldn’t, of course, at least not knowingly. But a few months into the Great Analytics Migration I described last month, our migration team found analytics users at Dell who had been doing the equivalent for years. That’s when the question about using a Ferrari as a dump truck started making the rounds.

    Better-fitting tools for data management and manipulation

    The “Ferrari” was a well-known legacy analytics software product designed to run on mainframes back in the 1970s. (You can probably guess which one it is).

    It happens that the product includes tools for data management and data manipulation, so our users became accustomed to using the Ferrari, with its high licensing fees and remarkable analytics capabilities, for “hauling loads of dirt” – that is, manipulating data before analyzing it.

    In all fairness, most of the users were just doing what they’d learned from other users within Dell. And they weren’t ruining the Ferrari’s transmission or even scratching the paint. But as an enterprise software company, we have a line of products like Toad Data Point that cost less and are perfectly suited to the task of accessing and working with big data sources. And anyway, the entire migration project was about moving off the well-known legacy analytics software product and replacing it with Statistica, an easier-to-use analytics platform that we had acquired.

    So using a Ferrari to haul a load of dirt was costing us licenses that didn’t need to be tied up on data management and data manipulation tasks. It’s an extremely expensive way to perform relatively common functions.

    Better yet, as we separated analytics from data management at the software level, we also separated them at the organizational level. In the course of our migration project, we moved data management and manipulation to Toad Data Point, handled by data integration experts, and analytics and modeling to Statistica, handled by analytics professionals. That has put each team in its respective wheelhouse.

    The Great Analytics Migration – new e-book

    Are you by any chance using a well-known legacy analytics software product to manipulate your data? If so, then some of your users are probably using a Ferrari to haul dirt.

    That may be all right with you, but if it isn’t, have a look at “Statistica: The Great Analytics Migration, Part 3: Technology” to find out how we switched analytics platforms worldwide in a matter of months.

  • Information Management

    Coming Out on Top – Toad and SharePlex Take DBTA Readers’ Choice Awards

    Thanks for making Toad and SharePlex #1 again! In the 2015 Readers’ Choice Awards sponsored by Database Trends and Applications Magazine, Toad Development Suite was named Best Database Development Solution, Toad DBA Suite was named Best Database Administration Solution and SharePlex was named Best Streaming Data Solution.

    More than 30,000 data professionals read DBTA, and that distinguished following recognized Dell products as winners in three categories and finalists in nine more of the 29 total categories. Having real-world users award us winner or finalist in almost half of all categories is quite an honor, and we’re gratified that you think so highly of our work.

    Organizations are becoming increasingly data-driven and look upon data as a competitive advantage. It’s no surprise that so many pros rely on Toad to help them meet their SLAs by being more productive, improving performance and ensuring high-quality applications can be delivered faster in their RDBMS and NoSQL environments.

    SharePlex got the nod from DBTA for extending the value of existing systems by integrating with modern systems, making data available to users in real-time and enabling active reporting and fast decision making.

    Dell’s Winners and Finalists in DBTA Readers’ Choice Awards

    Read all about the Readers’ Choice Awards, including judging criteria and category details. Here is a list of the Dell products that were named winners and finalists in this year’s awards:


    Winners:

    Best Database Administration Solution – Toad DBA Suite

    Best Database Development Solution – Toad Development Suite

    Best Streaming Data Solution – SharePlex

    Finalists:

    Best Database Backup Solution – LiteSpeed for SQL Server

    Best Change Data Capture Solution – SharePlex

    Best Data Modeling Solution – Toad Data Modeler

    Best Database Performance Solution – Spotlight on SQL Server Enterprise

    Best Business Intelligence Solution – Toad Business Intelligence Suite

    Best Data Mining Solution – Statistica

    Best Cloud Integration Solution – Boomi

    Best Query and Reporting Solution – Toad Data Point

    Best Data Storage Solution – Compellent

    That’s 11 unique Dell products chosen out of 367 products nominated from dozens of vendors. These awards also appear in the August 2015 print edition of Database Trends and Applications Magazine, which has 20,000 subscribers.

    Download your free trial version

    If you’re not yet using one of the winning products, see what all the fuss is about. Click on the following links for a free trial version:

    Toad DBA Suite for Oracle

    Toad Development Suite for Oracle


    Thanks again for making us number one! We have lots more on the roadmap for all of these products, so keep your eye on us for next year’s awards.

  • Statistica

    Thought Leaders in the Mix - Sept 2015

    Our subject matter experts for Statistica and the Information Management Group (IMG) keep busy, staying abreast of current trends with big data and small data, predictive software and real-world analytics solutions. And they frequently comment in industry publications, professional forums and blogs—or produce videos, in Shawn Rogers' case. Here are a few of their recent articles.
    Automated analytics can fill in for data scientists: But...(!)
    by Dr. Thomas Hill, executive director analytics, Dell Software

    One way to look at how predictive modeling technology will transform the healthcare sector is to compare it to other industries that were the earliest adopters—and automaters—of such methods. In this Health Data Management article, Dr. Hill asks whether healthcare data science and predictive modeling could be similarly automated, and what exactly that would look like.



    Video clips from London's Cloud World Forum 2015
    by Shawn Rogers, chief research officer, Information Management Group

    Speaking recently at Cloud World, Shawn addressed the great business opportunity afforded by hybrid data environments, where the cloud presents an interesting convergence of technologies and capabilities that enable data processing from almost anywhere—often with the purpose of applying advanced analytics that lead to insights not previously understood.



    Thirst for Advanced Analytics Driving Increased Need for Collective Intelligence
    by John K. Thompson, general manager advanced analytics, Dell Software

    In his latest article, John Thompson explains that the data scientist skills gap will not deter data-driven organizations from achieving the benefits of predictive analytics, thanks to their willingness to pursue collective intelligence as a practical, collaborative workaround that is powerful enough to "change the world."



    Three Tips for Surviving Today's Complex Data Landscape
    by John Whittaker, executive director, product marketing, Information Management Group

    In this article contributed to Data Center Knowledge, John acknowledges that many organizations collect a disparate mix of structured and unstructured data, and he spells out three information management priorities for DBAs to maintain efficiency and achieve successful integration with analytics downstream.





  • Direct2Dell

    Opening Doors of Big Data Innovation with IT and Business Alignment

    Deriving value from big data is getting a lot easier, thanks to the continuing breakdown of both technologic and economic barriers. Many people would have you think that big data is a new concept, when, in fact, big data has been around a lot longer than most people think. It just used to take a federal grant to do anything with it.

    Now, however, technology has evolved to where it’s possible to analyze data at the speed of business more economically than ever before. This opens doors of innovation to a much broader swath of organizations that can use information to drive their businesses forward—faster, further and more competitively.

    But to cross the finish line, you need to answer the following questions:

    1. How can I transform my business using data, especially information I couldn’t address in the past?
    2. How can I do better, smarter things with data while driving operational efficiencies?
    3. How do I align technical and business people to fully utilize data and take our business to the next level?

    When I first got into this business, it was enough to figure out how many widgets were sold in a particular region. Now, companies want to know how many widgets were sold in a particular region, in a certain color, to a specific customer, 10 minutes ago. Or, even better, they want to be able to predict the result before it happens. This takes a different approach to information—one that requires IT and business people to be in lockstep before opening the data floodgates. As the saying goes, it takes a village.

    The extra effort to align is well worth it, however, as great things can happen when business and IT leaders are on the same page. At Information Laboratory, a leader in the development, manufacturing and distribution of medical devices, getting there meant giving research scientists and engineers ready access to a wealth of production test data. It also meant that both groups needed the ability to perform analyses of manufacturing, quality control and supply chain information to drive better quality and product innovation.

    With Dell Statistica, analysts throughout the company can help themselves without IT intervention. As a result, Information Laboratory has taken advantage of its organizational intelligence to streamline and improve manufacturing operations. They’ve accomplished this by quickly identifying and fixing any problems associated with producing hundreds of thousands of device cartridges containing a card with a variety of measurement sensors. The bottom line: Information Laboratory has saved hundreds of thousands of dollars by avoiding the need to scrap a single batch of sensor cards.

    Without technology, cost and organizational barriers, companies can drive innovation and deliver collective intelligence to those who need it most. This will be key to achieving success in a data-driven Internet of Things (IoT) world.  Despite what some pundits say, IoT is not a new trend, as machines have been pumping out data for a really long time. Companies like Information Laboratory and others in healthcare, manufacturing, and automotive have been working with sensor-generated data for years. What’s new is the level of connectivity between data sources and the availability of cost-effective technologies and analytics tools that enable companies to do more with their data.

    And, the more you can do with data, the better positioned you are to handle the massive scale required to integrate sensor-generated information with other digital data. Companies will demand the agility to execute analytics, as well as manage and secure it at the edge near the sensor as well as at the core of the IoT/data environment.

    Think about the improvements in healthcare decisions if patients can work more directly with their physicians to fill in information gaps. They can infuse EMR data with self-generated information from their fitness wearables and data gathered at home, such as exercise levels, walking heart rate, recent glucose readings, etc. Doctors can blend that information with vitals data gathered during office visits. Applying industry best practices to the EMR data enables physicians to offer personalized, more effective recommendations designed to improve patient outcomes.
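    To make that blending concrete, here is a minimal sketch (all patient IDs, column names, and readings below are hypothetical, not from any real EMR schema): home-generated wearable readings are summarized per patient and joined to office-visit vitals so both sources sit side by side.

```python
import pandas as pd

# Hypothetical self-generated readings from fitness wearables and home devices.
wearable = pd.DataFrame({
    "patient_id": [101, 101, 102],
    "date": pd.to_datetime(["2015-09-01", "2015-09-08", "2015-09-03"]),
    "walking_heart_rate": [88, 84, 95],
    "glucose_mg_dl": [105, 99, 130],
})

# Hypothetical vitals gathered during office visits.
visits = pd.DataFrame({
    "patient_id": [101, 102],
    "visit_date": pd.to_datetime(["2015-09-10", "2015-09-05"]),
    "resting_heart_rate": [72, 81],
    "systolic_bp": [118, 135],
})

# Summarize each patient's home readings, then join the summary to the
# visit record so the physician sees both sources in one row.
home_summary = wearable.groupby("patient_id", as_index=False).agg(
    avg_walking_hr=("walking_heart_rate", "mean"),
    avg_glucose=("glucose_mg_dl", "mean"),
)
combined = visits.merge(home_summary, on="patient_id", how="left")
```

    A left join keeps every visit even for patients with no home data, which matters when wearable adoption is uneven across a patient population.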

    While the healthcare industry offers plentiful examples, any company that breaks down barriers can achieve measurable big data benefits. One major advantage at these organizations: No one is afraid of their data. Also, IT isn’t always the center of innovation as key business stakeholders—from sales, marketing, finance and customer support—are funding major projects.

    If you can marry your big data, regardless of where and how it’s generated, with crucial business processes, you’ll win. What’s your strategy for breaking down barriers and opening doors of innovation? Connect with me on Twitter at @shawnrog to share your story.

  • Information Management

    #ThinkChat– Patient Engagement… Is It the Answer or Just Buzz?

    Follow #ThinkChat on Twitter Friday, September 18th, at 11:00 AM PDT, for a live conversation exploring the importance of patient engagement.

    Patient engagement can mean many things. Engaging patients through new technologies can help facilitate better communication, education, and collaboration, resulting in better health outcomes. When there’s an increase in meaningful physician-patient interactions, patients are more likely to get involved in their personal health care. This may be just one step that can eventually lead to patient empowerment. Join the conversation and share your perspective on how patient engagement can lead to better health outcomes.

    Join Shawn Rogers (@ShawnRog), Chief Research Officer for Dell IMG; Dr. Nick van Terheyden (@drnic1), Chief Medical Officer at Dell; Janice Jacobs (@JaniceJacobs44), Social Media Solution Leader, Dell Services; Stephanie Bartels (@Steph_bartels19), Global Solution Leader - Patient Engagement, Dell Services; and Mandi Bishop (@mandiBPro), Health Plan Analytics Innovation Practice Lead, Dell Services, for this month's #ThinkChat as we talk with the community about the impact of patient engagement.

    Join @DellBigData and share your own personal stories about patient engagement and follow #ThinkChat and #NHITweek!

    Questions discussed on this program will include:

    • There is a lot of talk about patient engagement. Is it all talk or are healthcare organizations investing?
    • What are the barriers to engaging people?
    • Patient engagement seems like a buzzword; we hear it a lot. Is it a fad or is it here to stay?
    • What's your point of view on how patient engagement is defined?
    • Is there real ROI from engaging patients?
    • Doesn’t it cost a lot, though, to work with patients 1:1? Do hospitals have the resources to do this?
    • What are the barriers to getting clinicians to do things differently with patients?
    • What can we do differently to be more successful with patient engagement strategies?
    • How can we leverage mobile technology to more effectively engage patients?

    Where: Live on Twitter – Follow Hashtag #ThinkChat to get your questions answered and participate in the conversation!

    When:  September 18th, at 11:00 am PDT

    Shawn Rogers

    About Shawn Rogers

    Shawn Rogers is Chief Research Officer for the Information Management Group at Dell Software. Shawn is an internationally recognized thought leader, speaker, author and instructor on the topics of IoT, big data, analytics, business intelligence, cloud, data integration, data warehousing and social analytics. Shawn has more than 19 years of hands-on IT experience. Prior to joining Dell he was Vice President Research for Business Intelligence and Analytics at Enterprise Management Associates a leading analyst firm. Shawn helps customers apply technology to fuel innovation and create value with data.

    View all posts by Shawn Rogers

  • Direct2Dell

    Postcards from the Edge of IoT Analytics

    Old postcards illustrate how Dell offers solutions for managing, securing and analyzing data from the data center to the farthest endpoint

    Late last month, I participated on a panel at the IoT Evolution Conference & Expo, entitled “Unleashing value from analyzing data generated by the Internet of Things.” Joining me were Syed Hoda, CMO at ParStream, and Laurie Lamberth, associate partner at 151 Advisors. Even though it was the last day of the conference, we had the good fortune to share insights with a standing-room-only crowd eager to learn how real-time analytics could help generate more value from their IoT initiatives.

    It’s crystal clear that IoT can help companies drive significant operational efficiencies and business growth. The trick is figuring out the best way to address the rapidly rising numbers of sensors, embedded systems and connected devices, which are taking data volume and complexity to a whole new level.  

    A recent report from ABI Research estimates the volume of data captured by IoT-connected devices will surpass 1.6 zettabytes within five years. According to ABI, only a fraction of this data is currently being captured for further analysis because the vast majority is stored or processed locally without a way for it to be easily shared across the enterprise to aid decision making.

    Many companies are betting on fog computing to solve this problem by reducing the amount of local data that needs to be transmitted back to the cloud for processing and analysis. Bringing these functions closer to the data source will let companies extend the benefits of cloud computing to their network edge for faster, easier and more meaningful business insights. That’s where edge analytics comes in, as the ability to access time-sensitive, geospatial data opens the door for real-time analysis of data with increased accuracy and context.

    Edge analytics will help fulfill the promise of IoT and will be essential for scaling IT infrastructures to reliably capture, store and ensure accessibility to data generated by hundreds of billions and even trillions of devices. The sheer volume and complexity of managing all of this decentralized, localized data can quickly overload traditional environments and analysis tools.

    Most legacy solutions haven’t been designed to ensure low-latency data access for geospatial workloads at the enterprise’s edge. A lack of protocol standards also complicates cross-domain data sharing while alignment challenges between IT and business stakeholders can quickly derail strategy development and implementation.

    That’s why we recommend architecting for analytics, as the success of any deployment will be tied directly to the quality of the insights gleaned. For many Dell customers, this means having the flexibility to deliver predictive analytics at the core while creating a path for performing data aggregation and scoring at the edge.

    For Smart Start, a Grapevine, Texas-based leader in alcohol monitoring technology that manufactures a line of ignition interlock breath analyzers, Dell devised an edge analytics solution for sending near real-time quality data from its production line. The goal: Increase product quality throughout the company’s supply chain. The solution: A multi-tier automation and data management system that collects data from each of the assembled products, analyzes it using custom algorithms, then aggregates it from multiple manufacturing sites into the cloud so the latest, most accurate details can be presented in reports and visualization tools.
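    A toy sketch of that multi-tier pattern (the site names, pass/fail thresholds, and readings below are invented for illustration, not Smart Start's actual algorithms): each site scores its own unit-level test data at the edge and ships only a compact aggregate to the cloud tier.

```python
from statistics import mean

def score_unit(reading, low=0.45, high=0.55):
    """Pass/fail scoring for one sensor test reading (thresholds are invented)."""
    return low <= reading <= high

def aggregate_site(site_id, readings):
    """Edge-side aggregation: reduce a batch of raw readings to one summary record."""
    passes = [score_unit(r) for r in readings]
    return {
        "site": site_id,
        "units_tested": len(readings),
        "yield_pct": 100.0 * sum(passes) / len(passes),
        "mean_reading": mean(readings),
    }

# Each manufacturing site scores locally; raw per-unit data never leaves the edge.
site_a = aggregate_site("A", [0.48, 0.51, 0.62, 0.50])
site_b = aggregate_site("B", [0.47, 0.49, 0.53, 0.52])

# The cloud tier works only with the per-site summaries, e.g. a fleet-wide yield
# weighted by how many units each site tested.
fleet = [site_a, site_b]
overall_yield = (
    sum(s["yield_pct"] * s["units_tested"] for s in fleet)
    / sum(s["units_tested"] for s in fleet)
)
```

    The design choice is the one the post describes: analysis happens near the sensor, and only small, report-ready aggregates cross the network into the cloud.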

    The key to the success of this—and any—deployment is using modular, architecture-agnostic solutions that scale quickly from pilot to production. Of course, bolstering security is critical as an exponential increase in connected devices introduces an exponential increase in security risks. Dell puts security first to ensure our customers don’t end up with the “Internet of Compromised Things.”

    At Dell, we practice what we preach with solutions for managing, securing and analyzing data from the data center to the farthest endpoint and along all the networks and clouds in between. We suggest starting small and building on current technology investments and real-world successes. Luckily, our customers are well positioned to take advantage of Dell’s end-to-end hardware, software and services framework to build secure, extensible, supportable, expandable and configurable IoT solutions today.

    What are you doing to make the Internet of Things real…today? How do you plan to deploy edge analytics and unlock greater value from your data? Connect with me on Twitter at @alertsource to join the conversation.

  • Statistica

    Using Dell Software Support for your Statistica products

    We invite you to take a few minutes to explore the Dell Software Support Portal, where a host of tools and capabilities are easily accessible 24x7x365 to help you get the most from your Statistica products and engage with our experts. From one central location, you will find everything you need.

    In the event you do need to contact technical support, you can submit a Service Request via the support portal for the quickest and most effective means of connecting with your regional support representative.

    Opening a Service Request online ensures:

    • Faster response times – Once we receive your request through the support portal, we’ll get back to you as quickly as possible with the assistance you need.
    • Streamlined case management – All of your essential details about your case are submitted online including attachments like log files, screenshots, reports, or trace logs.
    • No telephone hold time – We route your case to the most qualified engineer, so you don’t have to wait on hold.

    We are also excited to announce the new Statistica User Discussion Forum, where savvy minds are invited to post content and questions about all things analytics. Our community forum is regularly monitored by Dell experts and peers to provide best practices, seek feedback, and make product suggestions. We look forward to your participation!

    For more information, visit the Dell Software Support Portal!


  • Information Management

    How a Hybrid Solution Addresses the Crisis of Data Speed

    Subscribers to our Statistica Monthly Newsletter have already been made aware of the latest EMA/9sight survey results that found data-driven businesses are becoming more interested in speed than volume. This shift in focus will change market dynamics, and that’s what the survey’s executive summary is all about.

    When you read the summary report, you will learn some interesting things:

    • Speed Is Driving Competition – Speed of processing response was the most frequently indicated use case by respondents, at nearly 20%.
    • Time to Value with Applications – Over 20% of respondents implemented big data projects using customizable applications from external providers.
    • Low Latency is High Profile – Low latency is a priority for big data projects, with over 32% described as real-time or near-real-time processing of data.
    • Two-time Use Case Champion – For the second year in a row, the top use case for big data initiatives is speed of processing response, at nearly 20% of mentions.

    Clearly, a growing portion of respondents are feeling the need for speed.

    Of course, speed is intrinsically related to volume and data structure. It is because of the growing volume and variety of data—especially with the onset of the Internet of Things—that data collection and preparation now require extra attention so that time-to-value (i.e., speed) can be maintained or improved. It is also true that not every business is ready to roll with 100% all-new infrastructure (hardware + software + sensors + workflows) to handle all this change from Day One, which means that most—if not all—businesses are likely implementing their data-driven strategies in piecemeal fashion, with a mix of old and new technologies plus a wish list for more.

    This is a good place to mention the “Hybrid Data Ecosystem” as a valid means of addressing the speed issue. EMA originally defined the big data Hybrid Data Ecosystem (HDE) several years ago through end-user surveys and interviews with technology thought leaders, implementation specialists, and software vendor experts. Each platform within a HDE supports a particular combination of business requirements along with operational or analytical processing challenges. Rather than advocating a single data store that supports all business and technical requirements at the center of its architecture, the HDE seeks to determine the best platforms for supporting particular sets of requirements and then links those platforms together. In this sense, HDE makes the most of the messiness of reality and the overlap of various technologies that exist side-by-side in many businesses today.
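    To make the idea concrete, here is a toy sketch of HDE-style routing: each workload's requirements are matched to the platform best suited to serve them, rather than forcing a single central data store to handle everything. The platform names, requirement keys, and rules below are invented for illustration only; they are not part of EMA's model or any Dell product.

    ```python
    # Toy illustration of the Hybrid Data Ecosystem idea: route each workload
    # to the platform that best matches its requirements. All names and rules
    # here are hypothetical.

    RULES = [
        # (predicate over a requirements dict, platform that serves it)
        (lambda r: r["latency"] == "real-time", "in-memory store"),
        (lambda r: r["structure"] == "unstructured", "Hadoop cluster"),
        (lambda r: r["workload"] == "analytical", "columnar warehouse"),
    ]

    def route(requirements, default="relational database"):
        """Return the first platform whose rule matches the requirements."""
        for predicate, platform in RULES:
            if predicate(requirements):
                return platform
        return default

    print(route({"latency": "real-time", "structure": "structured",
                 "workload": "operational"}))  # in-memory store
    ```

    The point of the sketch is the architecture, not the rules themselves: requirements decide the platform, and the platforms are then linked together rather than consolidated.
    
    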

    Let’s face it: conversions, migrations, and upgrades don’t happen overnight and usually involve transition periods that may last into perpetuity. Accordingly, the Hybrid Data Ecosystem is constantly refined. This year, for instance, EMA expanded the HDE scope to include the influence and impact of the cloud on big data environments and data consumers.

    You’ve simply got to see the comprehensive infographic that represents the latest HDE model, and read how Dell Software’s big data offerings (including Statistica) map to it. Click the image below to get to the report.

  • Information Management

    #ThinkChat – Real-Time Analytics: Myth or Right Around the Corner?

    Follow #ThinkChat on Twitter Friday, September 4th, at 11:00 AM PDT for a live conversation and discover how your peers are using real-time data and analytics!  

    The state of analytics is evolving fast and while more people within the business are utilizing and relying on the value that analytics presents, new demands for faster insights are stretching our traditional analytic infrastructure. Real-time analytics are an exciting opportunity for many companies. Do you have the architecture and tools you need to match the speed of the business?

    Join Shawn Rogers (@ShawnRog), Chief Research Officer for Dell IMG, Joanna Schloss (@JoSchloss), Dell Software's Analytics Thought Leader, and special guest Dean Abbott (@DeanAbb), Chief Data Scientist at SmarterHQ, for this month's #ThinkChat and talk with the community about how real-time analytics is impacting your business.

    Join us and share your own personal stories about real-time data or analytics!

    The #ThinkChat Agenda Includes:

    • What solutions/tools do you or your organization use for real-time analytics?  What are your favorites?
    • How does real-time affect your day to day decision making?
    • Do you see a real need for real-time analytics?
    • What hinders you or your org from achieving real-time decision making with real-time data?
    • Promote your favorite sites/brands - where do you go to meet analytic professionals?
    • Do you have any favorite books or blogs on big data and real-time analytics?
    • What is your opinion on where or how real-time evolves? Where is it going from here?
    • Is your architecture ready to support real-time analytics?
    • Are there specific workloads that are moving you toward real-time analytics?

    Where: Live on Twitter – Follow Hashtag #ThinkChat to get your questions answered and participate in the conversation!

    When:  September 4th, at 11:00 am PDT

  • Information Management

    Drivers Wanted: Statistica User Forum Now Open

       Image credit: Pat Herman

    In the most recent issue of our Statistica Monthly Newsletter (yes, you can subscribe for free), our readers were made aware of a new Statistica user forum in our community pages. The new forum is intended to be a true user-to-user community, with discussion threads driven by the users, of the users, and for the users.

    The good news is you don’t have to be a Statistica guru to participate! Whatever your level of expertise, this forum provides you the opportunity to share your best practices, seek feedback on vexing challenges, make product suggestions and expound on specific analytics and data topics that interest you. You can promote yourself by linking to relevant blogs and articles you have written in other forums, too, such as LinkedIn groups. This brand-new forum will be regularly monitored by Dell experts and peers alike, so you can anticipate that your posts will be addressed even as we build our new community audience from scratch.

    Why is this a big deal? Because the development of the Statistica platform itself is a response to your real-life use cases and the industry trends that affect you. And because you can improve your own knowledge base (and your personal brand) by collaborating with fellow Statistica users. You never know where the next big idea may come from. Here I will happily defer to greater minds than my own:

    • “Alone we can do so little; together we can do so much.”
      ― Helen Keller
    • “You can't develop all the competencies you need fast enough on your own. Furthermore, if you don't collaborate, your ideas will be limited to your own abilities.”
      ― Vishwas Chavan
    • “Many ideas grow better when transplanted into another mind than the one where they sprang up.”
      — Oliver Wendell Holmes
    • “Share your knowledge. It’s a way to achieve immortality.”
      — Dalai Lama
    • “The only thing to do with good advice is to pass it on.”
      — Oscar Wilde
    • “All knowledge is connected to all other knowledge. The fun is in making the connections.”
      — Arthur Aufderheide

    The sun never sets on the Statistica empire, because there are over 1 million Statistica users in dozens of countries around the globe, in industry and academia and government. As a Statistica user, you are never alone. So share the forum link with your peers, and we look forward to your participation.

    For more information, subscribe to the Statistica newsletter >

  • Information Management

    Speaking from Experience: Surprise Processes During a Major Analytics Migration

     Months of planning complete, hardware and software procured, associates prepped. The sails are set, and the stars are aligned to flip the switch on a major analytics platform migration. That’s what the buildup felt like when Dell was ready to start moving users from our legacy analytics platform to Statistica, an easier-to-use and lower cost solution, as the company’s analytics platform.


    In a previous blog post, we covered the lessons learned from process planning. It bears repeating that the time Dell spent working through process-oriented questions positioned the organization to start moving users onto Statistica on schedule. Even so, when the actual migration is in progress, some new and perhaps surprising processes pop up.


    What actually happens when a company migrates to a new analytics platform? Find out in the Dell-on-Dell case study. The e-book, “Statistica: The Great Analytics Migration,” is available for download.


    During our migration, Dell realized that it was the business leads in the Centers of Excellence (CoEs) that had the best glimpse into progress, knowing how close each user or department was to migrating completely off the legacy analytics platform. The CoEs also had the insight into unexpected roles and tasks users and managers took on along the way. Let’s look at three:


    Working in two platforms: Perhaps it’s not surprising that a migration of this magnitude would put added strain on employees taking part in it, but there are only so many hours in a workweek. If teams are expected to add migration tasks to their daily workflow, plan deadlines accordingly.


    Double checking, for a while: Analytics are pervasive at Dell and drive mission-critical business applications, so to ensure the integrity of results and minimize risk, Dell ran Statistica and the legacy analytics system concurrently to make sure everything was operating as expected. The process was unexpected and time-consuming, but necessary before the legacy analytics platform could be turned off.


    Trying to align individual business groups: You’ve heard the phrase “herding cats.” It’s certainly an apt comparison for managing a migration in which various groups operate on their own timelines, working toward their own objectives. But success means getting all groups to completion by the overall deadline.


    During the migration, we encountered some additional necessary processes to move the migration along. For instance, the team realized it was important to stop and correct inefficiencies, despite the reluctance to take any time away from moving forward. Managers also experimented with different motivation tactics, including contests. To find out more about the actual migration, download the Dell-on-Dell case study, “Statistica: The Great Analytics Migration, Part 2: Process,” which recounts ways to get all teams to stick to the deadline. Would you expect pressure in your organization to come from the business leads or IT?


    There’s more to come from Dell on its Statistica migration. In part 3 of the e-book, we’ll cover all aspects related to technology components of the migration project — from architecture to tooling. In the meantime, read part 2 to get more insight into our migration process. 

  • Information Management

    The Happy Connection between Statistica and Toad

    Subscribers to the Statistica Monthly Newsletter already received a heads-up about several events coming over the next few weeks, including the combined Statistica-Toad tech webcast on August 27, “The Smart Data Analyst’s Toolset.”

    Maybe the Toad name doesn’t mean much yet to longtime Statistica Enterprise users, but it will. That is because Statistica—historically a very robust and IT-friendly analytics platform on its own merits—is now integrated with Toad Data Point and Toad Intelligence Central (TIC), fellow members of Dell Software’s information management portfolio.

    What does Dell Toad do for you and why should you care? Well, that’s what the webcast will cover in detail, so let’s just summarize here by saying Toad opens your door to a whole new world of big data interconnectivity upstream of Statistica. Toad Data Point offers self-service data access, integration, and data preparation functionality with relational and non-relational sources. (And who wouldn’t be happy with more tools for handling data preparation, one of the leading sources of posterior pain among data analysts?) Meanwhile, on the server side, TIC seamlessly connects data, users, and files for well-governed collaboration.

    Once again, this is Statistica’s way of “playing nice” with your existing IT assets and software components, a practical trait for which our platform has long been hailed by satisfied users worldwide. If you already have Dell Toad in place, now you can call Data Point and TIC directly from within Statistica Enterprise. If you do not yet have Toad in place, you really should be taking a look at Toad now!

    Either way, the August 27 webcast will showcase the new Statistica-Toad connection that enables you to provision data sources across platforms like never before in order to produce a single system for data profiling, cleansing, analysis, modeling, deployment, and scoring.

    A powerful connection, indeed.

    For more information, subscribe to the Statistica newsletter >

  • Information Management

    The “Process” to Take the Pain out of Migrations

    Uttering the word “process” will likely send a shudder down the spine of any business or IT pro who anticipates the planning, resources, timelines and deadlines that are all part of said process. Despite the resistance, in many cases, and especially when facing a major IT migration, it’s the process that ensures all stakeholders are satisfied by the result.


    Dell recently migrated from its legacy analytics platform to Statistica – and we’re hoping our experience will help other companies in their own migrations. In Part I of the Dell-on-Dell e-book about our Statistica deployment, we reviewed the steps we took to get our people – all associates who touch the analytics platform – on board at integral parts of the migration.


    The fear of a major migration shouldn’t stop your organization from deploying a better solution. Learn how Dell moved to a new analytics platform in the e-book, “Statistica: The Great Analytics Migration.”


    In Part II of the e-book, we’re addressing several process-oriented challenges and questions. A few important process lessons learned:


    The migration “process” starts before you even realize it – and no one likes surprises: Even if the executive staff or IT decision-makers share plans with all stakeholders as soon as the project is confirmed, anticipate that some savvy stakeholders already suspect a change is afoot. For Dell, our associates expected a change when Dell acquired Statistica. For other companies, a poorly performing analytics platform or a change in executive leadership could signal a coming migration. Either way, informing those involved sooner rather than later will limit the number of people caught off-guard.


    Investing time in laying the groundwork is well worth the effort: Before a single action was taken, Dell pooled every available resource from the Statistica team to truly understand the platform and IT requirements. That process alone took a month. But it was valuable time spent to plan and align expectations. Better informed stakeholders could more quickly and accurately answer other process questions:


    • How long will the migration take?
    • How many users really need to migrate?
    • How fast can we be up and running?


    Centers of Excellence are monumentally important: Creating Centers of Excellence (CoEs) sounds like a process in and of itself, doesn’t it? However, it’s these groups of stakeholders, organized by business function, that help the organization through a migration overall. CoEs provide valuable insight into how the project can be helped by, or can help, each business function. At Dell, our CoEs identified the analytics platform users that should be part of the migration, which helped our team allocate resources appropriately.


    Process-oriented planning is tough, and it’s a challenge to get all associates on board with the rigor necessary to make a migration successful. But the advantages of process planning far outweigh the perceived time savings of rushing through a migration. In fact, with pre-planning, Dell was able to get hardware online and ready to accept users in a five-week period – a task that would normally take three months!


    For more insight into how Dell ticked through the process-oriented questions we faced, download the e-book, “Statistica: The Great Analytics Migration, Part II: Process.” Whether your organization is migrating 10 users or 10,000 users in a 6-week or 6-year project, our answers may help your migration process go smoothly.

  • Information Management

    #ThinkChat- Big Data: Where Security and Innovation Meet

    Follow #ThinkChat on Twitter August 13th at 11:00 am PDT for a live conversation about how big data innovation is reshaping enterprise security!

    Join Shawn Rogers (@ShawnRog), Chief Research Officer for Dell IMG, Joanna Schloss (@JoSchloss), Dell Software’s Analytics Thought Leader, and John Whittaker (@Alertsource), Executive Director, Information Management Group at Dell, for this month’s #ThinkChat as they discuss how big data innovation and security are creating opportunities to excel and new challenges for the enterprise.

    Tweet with us about how big data drives innovation, and how many companies are jumping on the bandwagon to be first to gain competitive advantage using new data types, insightful applications and new data frameworks, all while trying to adapt to new security concerns, risks and best practices.

    Join the conversation!

    The #ThinkChat Agenda Includes:

    1. Is big data a challenger or enabler to enterprise security? Do today’s new solutions enable us to thrive?

    2. Where do privacy and big data meet? Innovation can breed issues that go beyond security. What precautions are critical?

    3. Are there best practices for non-relational data sources with regards to security?

    4. Are big data solutions like Hadoop enterprise secure? Do they meet the needs of today’s business?

    5. Do new big data frameworks create opportunity for better security analytics or create complexity?

    6. IoT is exciting but it opens the door for greater security challenges. What are the top best practices?

    7. Speed is at the heart of many security analytics scenarios. How does big data create value when speed is critical?

    8. What are some Big Data factors that change the way we approach data protection?

    9. How have your data protection needs changed over the past 2-3 years?

    10. Is there a difference between securing and governing big data versus traditional data?

    Where: Live on Twitter – Follow Hashtag #ThinkChat to get your questions answered and participate in the conversation!

    When:  August 13th, at 11:00 am PDT

  • Information Management

    #ThinkChat - The Sexiest Job of the 21st Century with @ShawnRog and @Tdav

    The role of the data scientist has become a bit of a legend in the analytics industry these past few years. Many of my DBA friends have gained instant career advancement with self-appointed promotions by adding this job title to their LinkedIn profiles, leading to raises and general admiration from their peers. Tom Davenport and D.J. Patil wrote about this topic back in 2012 for the Harvard Business Review, and it caused everyone to take notice of this analytically driven job. The role of the data scientist has evolved over the past three years, and so has the definition.

    In this week’s #ThinkChat segment, Tom and I discuss how the role of data scientist is changing, where data scientists fit in an organization, and whether or not everyone who has stealthily added the title to their resume owes Tom a cut of their newfound wages!

    #ThinkChat Conversation with Tom Davenport Part 7 of 7

    To view other segments in the #ThinkChat series click here.   

  • Statistica

    Thought Leaders in the Mix - August 2015

    Our subject matter experts for Statistica and the Information Management Group (IMG) keep busy, staying abreast of current trends with big data and small data, predictive software and real-world analytics solutions. And they frequently comment in industry publications, professional forums and blogs--or produce videos, in Shawn Rogers' case. Here are a few of their recent articles.
    Video: Analytics 3.0 meets analytic amateurs
    by Shawn Rogers, chief research officer, Dell Information Management Group

    This short video is part of Shawn's #ThinkChat video series with Tom Davenport, professor of management and IT at Babson College. Here the two briefly discuss how the landscape is evolving and how a growing community of advanced analytic users and enablers are fueling change. It’s the age of the analytic amateur and the semi-pro!



    Getting Real About Virtual Centers of Excellence
    by John K. Thompson, general manager for advanced analytics, Dell Software

    John takes a look at the many ways organizations can benefit from a decentralized, collaborative approach to analytics, an approach made realistically possible for more and more companies with the advent of simple, cloud-enabled tools.



    Analytics Migration Series: Anticipating Business User Reaction
    by David Sweenor, analytics product marketing manager, Dell Information Management Group

    Change can be daunting, especially when it involves unfamiliar technology to accomplish daily tasks. So when an entire workforce must migrate to a new software platform after years with legacy code, what kinds of questions do they ask? And how are their fears replaced with curiosity? David Sweenor introduces the first chapter of a three-part e-book describing Dell’s own recent migration.


    A glimpse at the future of predictive analytics in healthcare
    by Dr. Thomas Hill, executive director of analytics, Dell Statistica

    In his latest article, Dr. Hill discusses the technology revolution that will bring predictive analytics into thousands of healthcare applications and workflows, and he shares his perspective on the industry's various opportunities, disruptors and hurdles.




  • Information Management

    In the World of Big Data Analytics, Azure Is More Than Just a Pretty Color

    In the latest issue of Statistica Monthly News (yes, you can subscribe for free), our readers found a link to a webcast that talks all about Statistica’s new partnership with Microsoft, a relationship that produces some incredible hybrid cloud functionality for data analysis using Azure Machine Learning (ML).

    We are talking about a hybrid cloud solution whose powerful functionality completely belies the color Azure is named for: a shade of bright blue often likened to that of a cloudless sky. Cloudless? Hardly. The Statistica-Microsoft partnership is all about the cloud!

    The fun story in the webcast describes how one website was running an analytics program as an API on Azure. Designed to guess ages and genders of people in photographic images, the site was expecting a few thousand submissions, but it went from zero to 1.2 million hourly visitors within just two days of going live, and up to seven million images per hour. By day six, 50.5 million users had submitted over 380 million photos! Normally, we would hear about sites crashing with such a viral overload. But this site kept humming along even when the action ramped up so dramatically, primarily because Azure scaled dynamically as intended, handling the unforeseen load like a champ.

    Think about embedding this kind of cloud access and flexible scalability as a directly callable function inside Statistica—well, that just makes way too much sense, right? But that is what’s happened! Azure ML is really a development environment for creating APIs on Azure, intended to let users embed machine learning in any application, whether that is a web app or a complex workflow driven by Statistica. For instance, you can host your complicated models in the cloud with Azure and run non-sensitive, big data analytics out there—a very practical time saver and money saver. Then you can bring those analyzed results back down to join perhaps more sensitive data and analytics output behind your firewall. You can learn more when you watch our “Cloud Analytics” webcast.
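    For a sense of what calling a cloud-hosted model looks like from application code, here is a minimal sketch of building the JSON request body for an Azure ML classic request-response web service. The input name (`input1`) and columns below are hypothetical placeholders; a real service publishes its own schema, endpoint URI, and API key, which you would take from its generated API documentation.

    ```python
    import json

    # Sketch of preparing a scoring request for a model hosted as an Azure ML
    # web service. Column names and values are made-up example data.

    def build_request(rows, columns):
        """Build the JSON body for a hypothetical Azure ML request-response call."""
        return json.dumps({
            "Inputs": {
                "input1": {
                    "ColumnNames": columns,
                    "Values": rows,  # one list of values per record to score
                }
            },
            "GlobalParameters": {},
        })

    body = build_request([[34.2, 1.8]], ["temperature", "vibration"])
    # An actual call would POST `body` to the service URI with an
    # "Authorization: Bearer <api-key>" header (e.g., via urllib.request)
    # and read the scored results from the JSON response.
    print(body)
    ```

    The hybrid pattern described above amounts to exactly this: non-sensitive records go up in the request, scored results come back down to be joined with data behind your firewall.
    
    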

  • Information Management

    As a Slogan, “Big Data” Still Carries a Big Punch

    You might have heard that the term "big data" is over-hyped. And maybe you have heard it has already slid into the “trough of disillusionment” (as far back as early 2013, if you believe everything you read on the Internet). Even assuming these assessments are true, the fact remains that the term itself remains relevant and apt for many a business person still seeking best tips and practices for developing analytics projects.

    In marketing-speak, the “big data” slogan has stickiness. But for all its ubiquity—or, perhaps, because of it—the term "big data" remains something of an enigma, a source of curiosity for business leaders and data executives worldwide. That is to say, business people wanting to get into analytics still respond to that term more than others.

    To be fair, the longevity of “big data” works in its favor. Currently, people search for “big data” on Google an average of 60.5K times per month, perhaps because the term seems all-encompassing and broadly descriptive, a good place to start asking questions. Meanwhile, more recent phrases—despite their own merits and relevance—are not sought out nearly as often. For instance, “internet of things” currently averages only 40.5K monthly searches, and “predictive analytics” clocks in at 9.9K. And even if you think “cloud analytics” is destined to be the Google rage someday, right now that phrase averages only 390 searches per month. (That’s not a typo: it is 390.)

    This popularity is why we still like to use the “big data” moniker when talking about Statistica’s analytics prowess. Did you read our July issue of Statistica Monthly News? (Yes, you can subscribe for free.) In the sidebar list of events, our subscribers have already seen that we are offering a free Tech Webcast on July 30, “Statistica Eats Big Data for Breakfast.” This webcast will be presented by Mark Davis, the founder of Kitenga and now Distinguished Engineer at Dell Software. He will be focusing on the newer big data capabilities within Dell Statistica 12.7 and how those capabilities can benefit businesses in a variety of use cases, perhaps even in your industry.

    Register today and spread the word!

  • Information Management

    Analytics Migration Series: Anticipating Business User Reaction

     Dell’s SAS migration began shortly after we acquired the advanced analytics product Statistica. Within weeks, we had decided to move all of the company’s analytics users from SAS to Statistica. After assessing how the migration would affect employees, the next challenge was to get everyone on board.


    Change can be daunting, especially when it involves embracing unfamiliar technology to accomplish daily tasks. Our main strategy for getting employees on board was to replace fear of an unknown product with curiosity about how best to accomplish analytical tasks with it.


    Download the e-book, “SAS to Statistica: The Great Dell Migration – Part 1: People”


    Understanding the Reactions

    You don’t expect any sweeping change to be widely met with open arms and high-fives, so we were certainly prepared to address concerns from the Dell workforce. When the news was announced, most reactions fell into three buckets:

    • “But we’ve never used Statistica.” Our co-workers weren't familiar with how robust an analytics platform Statistica is, so naturally they were skeptical. (Skepticism is why we hired them, after all.) They knew that their work consisted of high-end analytics in SAS and assumed, incorrectly as it turned out, that Statistica wasn’t up to it.

    • “We’ve spent years writing thousands of lines of SAS code. We don’t want to just throw that away.” Our users balked at all the work of trying to replicate in Statistica the advanced analytics functions they had built in SAS. Who wouldn’t feel that way?

    • “We consider ourselves SAS professionals and analysts first, and employees of Dell second. For career longevity and our ability to do our job, we believe that it's really important to continue using SAS.” That’s a tough one. We found a number of heavy SAS users who had been working with the product for over 20 years. They were comfortable using it and they had grown, evolved and become pretty good with it over much of their career. Asking them to switch to something they didn't know was a huge imposition.


    Most users had never heard of Statistica and many of them felt a deep-seated career-attachment to SAS. Once we realized that, we started working on ways to replace their fear of an unknown product with curiosity about Statistica.

    Addressing the Concerns

    It was our responsibility to arm employees with as much knowledge about Statistica as possible. We began by arranging communication between our employees and our migration leads from Statistica, to show them that their long years of work would not simply be discarded.

    The leads examined the techniques and functions our users had worked with in SAS – K-means clustering, polynomial regression, GLM, ARIMA, neural networks, and more – and demonstrated how to replicate and enhance them in Statistica. Nearly all the techniques they had used in SAS were in Statistica, and were easier to implement. In short, they didn’t need to rewrite thousands of lines of code; they simply dragged and dropped icons on the Statistica workspace.
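    As an illustration of just one technique from that list, here is a minimal, textbook-style K-means (Lloyd's algorithm) on one-dimensional data. This is a generic sketch, not Statistica's or SAS's implementation, and the data and starting centers are made up for the example.

    ```python
    # Minimal K-means (Lloyd's algorithm) on 1-D points, for illustration only.

    def kmeans(points, centers, iterations=10):
        """Alternate assignment and update steps from fixed initial centers."""
        for _ in range(iterations):
            # Assignment step: attach each point to its nearest center.
            clusters = [[] for _ in centers]
            for p in points:
                nearest = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
                clusters[nearest].append(p)
            # Update step: move each center to the mean of its cluster
            # (keeping the old center if a cluster ends up empty).
            centers = [sum(c) / len(c) if c else centers[i]
                       for i, c in enumerate(clusters)]
        return centers

    data = [1.0, 1.2, 0.8, 9.0, 9.5, 8.5]
    print(kmeans(data, [0.0, 10.0]))  # two centers, near 1.0 and 9.0
    ```

    In Statistica, a user configures the same kind of analysis by dragging a clustering node onto a workspace instead of writing or porting code, which is the point the migration leads were demonstrating.
    
    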

    While the discussions eased their concerns somewhat, getting full-scale buy-in required more comprehensive onboarding. We’ll take a closer look at those strategies in an upcoming post.

    Download the Dell-on-Dell case study, “SAS to Statistica: The Great Dell Migration – Part 1: People,” to learn more about anticipating the reaction among your business users when you undertake an analytics migration.

  • Information Management

    #ThinkChat Innovating with Data Driven Products with @ShawnRog and @Tdav

    New data and new insights are giving way to new data-driven products and contributing to the digital economy. Companies with data-driven insights into markets, buying behavior, product performance, and procedure execution are able to leverage that insight to build innovative new products. These new data products can create meaningful revenue opportunities and enhance customer care and overall corporate execution.

    In this week’s #ThinkChat segment, Tom and I discuss how companies like GE, Monsanto, Google and Facebook are leading the way in data product innovation, and how smaller, more traditional companies can get in on the opportunity to turn their data into new services, products and revenue streams.

     #ThinkChat Conversation with Tom Davenport Part 5 of 7      

    To view other segments in the #ThinkChat series click here


  • Information Management

    What Do Sy Sperling, Victor Kiam and Michael Dell Have In Common?

    Those of a certain age may recall a famous ad campaign where the well-coiffed Sy Sperling stated, "I'm not only the Hair Club president, but I'm also a client."

    And who can forget pop culture icon, Victor Kiam, of Remington Products fame: "I liked the shaver so much, I bought the company!"

    Now add Michael Dell to that lineup. After the Statistica analytics platform joined the Dell Software portfolio in 2014, the company founder and namesake decided it would be smart to roll out the product in-house in an ambitious proof-of-concept showcase. It would answer the question on the minds of many a CTO: Is it possible to migrate a longstanding, major SAS shop to Statistica without needless disruption to culture or infrastructure? Our company answered that question, switching about 300 users and converting more than 300 projects across multiple business units in only six months.

    Whaaaat? You didn’t hear about this amazing feat? Our Statistica Monthly News subscribers already read about this. (Yes, you can subscribe for free.) In the technology world, using your own product is known as eating your own dog food or drinking your own champagne. And that’s exactly what Dell is doing.

    Maybe you have been wondering whether your own company could relieve itself of the costs and complexities of SAS. Well, now you can wrap your brain around that concept by reading exactly how Dell did it. Click through to our July newsletter below and look at the lead story there. While you are at it, be sure to subscribe to the newsletter so you can keep up with other useful info as we move forward.

    And remember: “I’m not only the Statistica newsletter editor, but I’m also a subscriber.” 

  • Information Management

    Analytics Migration Series: Preventing Disruption of Data Analytics

    In 2014, Dell acquired the advanced analytics product Statistica and set out on a project to migrate all of the company’s analytics work to our newly acquired solution. In our Analytics Migration Series, we’re taking a closer look at our journey in hopes of offering insight to other organizations embarking on this arduous but sometimes necessary process.


    One of the first tasks we faced was analyzing who would be impacted by the migration and defining their job functions. Dell is no different from most companies in that with any software there are levels of engagement. Most employees are casual users, working with a subset of the product functions to accomplish their daily tasks. Then there is a smaller subset of power users who eat, sleep and breathe the product; they would be the most affected.


    Download the e-book, “SAS to Statistica: The Great Dell Migration – Part 1: People”


    We quickly identified hundreds of users whom we needed to move to Statistica. The largest subset of those users works in Dell Global Analytics (DGA), a group that provides analytic expertise and support to a wide range of functional organizations throughout the company that don’t have their own internal analytics expertise. Here’s an overview of how the DGA team’s expertise impacts the company:


    • Supply chain. DGA provides insight that improves our manufacturing by predicting potential disruptions to supply chains around the world.
    • Technical support. Our services teams embed critical data into customer solutions as part of services engagements and for preventive maintenance on hardware.
    • Financial services. DGA provides critical analytics for modeling, assessing credit risk and detecting fraud. Their models are closely tied to forecasts and bank rates, so statistical analysis is part of what they do day in and day out.
    • Marketing. The BI used by marketing gives clarity to what customers and prospects are saying, tweeting, looking for and buying. We use analytics to personalize offers, attract prospects and keep existing customers, and we study things like customer churn, cross-sell and upsell opportunities.
    • Risk management. Not every decision is perfect, so we use DGA analytics to lower both the probability that known risks will materialize and the potential cost when they do.
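    The risk-management bullet boils down to expected-cost arithmetic: a risk’s expected cost is its probability times its impact, and analytics pays off when it shrinks either factor by more than the mitigation costs. A toy sketch of that framing (all numbers invented for illustration; these are not Dell figures):

    ```python
    # Hypothetical numbers, purely illustrative: the expected cost of a risk
    # is probability x impact, and a mitigation is worthwhile when it reduces
    # that expected cost by more than the mitigation itself costs.
    def expected_cost(probability, impact):
        return probability * impact

    def mitigation_value(p_before, p_after, impact, mitigation_cost):
        """Net savings from a mitigation that lowers a risk's probability."""
        return (expected_cost(p_before, impact)
                - expected_cost(p_after, impact)
                - mitigation_cost)

    # A supply-chain disruption with an $800k impact, whose probability a
    # model-driven intervention cuts from 25% to 12.5%, at a cost of $50k:
    print(mitigation_value(0.25, 0.125, 800_000, 50_000))  # → 50000.0
    ```

    The same arithmetic scales to a portfolio of risks, which is where the DGA models earn their keep.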


    In short, analytics is pervasive throughout Dell, DGA is a big part of our competitive advantage, and the migration affected our data analysts and users in all of those groups. More important, since analytics is ubiquitous at Dell and embedded in our systems and processes, many more people consume and rely on the analytic output. Any adverse change to that output could drastically impact the business.


    As a result, we had to take extra care to ensure the DGA team was fully on board with the process and make certain that their ability to deliver this business-critical data across the organization was not hindered. This step was challenging and time-consuming, but it increased the chances of a successful migration and minimized potential business disruptions.


    Download the e-book, “SAS to Statistica: The Great Dell Migration – Part 1: People,” for more insights into embarking on your own analytics migration project.