Blog - Post List
  • Statistica

    Is Statistica 13 Really All That Great? (Duh.)

    Probably every IT department has at least one cynic who believes that every software maker touts every new release as something earth-shattering. After all, why give software a new number if it doesn’t represent a quantum leap of some kind, right? However, it is arguably true that some releases may disappoint the masses while others may justify their sequential numerations. So, skepticism may be a healthy way of self-regulating one’s expectations.


    How does this apply to Statistica 13?

    Having said all that, you probably expect that I will now claim the new Statistica 13 really is earth-shattering (it is!) and that you should simply take my word for it (you should!). There actually are specific capabilities within this release that make touting its merits a very easy assignment. However, “earth-shattering” remains a subjective term, so I should not be so crass as to insist you take my word for anything.

    Instead, I will gladly let others make that case for me, because this Statistica release is very impressive and people are taking notice. Our newsletter subscribers already received a headful of headlines about Statistica 13, big data, and the Internet of Things (IoT) generated from our recent Dell World event in Austin, TX. Maybe you've run across these headlines yourself in other venues:

    On top of Dell’s own press release, these six articles are but a drop in the bucket of media coverage. What got journalists and analysts really excited about the Statistica 13 rollout, though, is our software’s application of Native Distributed Analytics (NDA), with which Statistica saves time and effort by pushing algorithms and scoring functionality into your databases, basically analyzing your data — even big data — right where it lives. Statistica 13 distributes analytics anywhere, on any platform. When it comes to dealing with streaming data and transfer limitations, NDA will fast become a busy analyst’s best friend.

    Meanwhile, you just know there are other enhancements in Statistica 13 that will make the user experience more enjoyable and productive with respect to data visualization, workspace GUIs, and more. After all, we had to pack in enough newness to justify that new number 13, right?

    Today is a good day to check out Statistica 13 to see what it can do for you and your business. Also, be sure to subscribe to the Statistica newsletter to keep abreast of our latest product info and thought leadership.

    Read the Oct/Nov Statistica Newsletter >

  • Statistica

    Free Statistica at College: The Gift That Keeps On Giving

    We've all been to college at one time or another. Some of you reading this post are still in school even now. And the majority of us are probably still paying off student loans.


    Speaking of college costs, maybe you have already learned about Dell Statistica's response to students in need. Our answer: FREE academic software!

    Major Costs Add Up at School

    Ponder your college years for a moment. Good times and challenging courses. But let’s focus on the struggle of the whole college experience ROI. What are your top complaints in this regard? If they relate to costs, you are in broad company. A nationwide survey of higher education students reveals a list of popular complaints, with a measurable percent stemming from costs:

    • “The price of textbooks!”
    • “College is too expensive.”
    • “The cafeteria food is gross.”
    • “Being broke all the time.”

    Okay, we can't help you with the cafeteria food, but you'll notice the other complaints are indeed about costs.

    Additionally, a plurality (39%) of respondents to Princeton Review's recent "College Hopes & Worries Survey" said their biggest concern is the level of debt incurred to pay for a degree.

    It comes as no surprise that everything at college costs more money than we like, and it all adds up. Consider textbooks alone, the bane of every undergrad out there. Costs vary greatly from one major to the next, but assuming new book purchases are required, a study based at the University of Virginia indicates that a statistics major is neither the most nor least expensive when it comes to textbooks. However, the study did find the average statistics textbook costs about $110, and students must buy multiple textbooks throughout that major's curriculum. The most expensive statistics book topped out at $342.

    And, as if that weren’t enough…students in the data sciences get to tack on the cost of basic analytics software, too. It's like buying a virtual textbook on top of the physical textbooks.

    What is the skills gap?

    Meanwhile, though it may vary from industry to industry, the data scientist skills gap is real. As long ago as 2011, McKinsey & Company was reporting that there would be a shortage of the talent organizations need to take advantage of big data. Barring some kind of change in the human resources supply chain, they predicted that by 2018 “the United States alone could face a shortage of 140,000 to 190,000 people with deep analytical skills as well as 1.5 million managers and analysts with the know-how to use the analysis of big data to make effective decisions.” This is great news for students looking to break into this career path.

    Change that Matters

    So, our free academic program in North America is the kind of “change” we can apply readily to impact that human resources pipeline at the university level. It may not sound like much, but remember that every little bit helps when we are talking about reducing the financial burden of students seeking a strong foundation with skills-based training and key software tools in order to increase their value in the competitive data science field.

    Think about it: The world needs more statistics and data science graduates to handle the deluge of big data challenges that are developing in every industry. Would the cost of just one more textbook—or, in this case, an analytics software package required by the professor—make or break the average student's ability to pursue the degree? Why risk it? We'll just give it away and let the chips fall where they may! If we choose to give away some software to help put more problem-solvers into the world’s workforce, then that's what we will do.

    And the value of such a program? Priceless! Not only is the free academic bundle a boon to the study of analytics in North American academia, but because it will expand the pool of graduates qualified for real-life analytical pursuits across industries, the effects of this program are all but immeasurable, with potentially world-changing impact. You just never know where the next genius case study will originate. Truly, the gift that keeps on giving.

    Read the Oct/Nov Statistica Newsletter >

  • Dell TechCenter

    Halloween Fun with the Dell Software Team

    From parties to haunted houses, trick-or-treating to giving out candy to the neighborhood kids, Halloween has always been a favorite holiday for my wife and me. We even set the date we would exchange our vows so that we would be on our honeymoon in New Orleans during Halloween. And let me tell you, if you are a fan of the holiday, I highly recommend being there for the event.

    Halloween Fun at Dell Software

    The Dell Software team enjoys Halloween as well. As you can see, many of our families go all out on this entertaining holiday.


    Cynthia went as the famous pop art piece by Roy Lichtenstein and had a great Nightmare Before Christmas pumpkin.


    Madison hung out with friends she's known since high school at a Halloween house party. She's the one in the center dressed as a witch in black.


    Jeanie and her husband Jeff had a blast at her neighbor's annual Halloween Dance Off party. Jeanie's daughter was a baby cheetah, and she was Momma Cheetah.


    Gio's dog has great Halloween spirit (and Gio has great photo editing skills!).


    Chris spent Halloween day coaching his daughter and other special-needs baseball players from District 62 Little League at Angel Stadium. The Challenger Baseball Classic is an annual event where special-needs baseball teams from southern California get to play a game on the field at Angel Stadium. (An amazing way to spend your time!)


    Emily went to a local Halloween maze and I think her face says it all. She had a good time.


    Amber's daughters went as Queen of Hearts and Sweety Kitty. She calls this photo “The girls just wanna have fun!”


    Ryan had a fantastic Harry Potter Party with his friends, complete with Sorting Hat and their very own Quidditch Cup!


    My son and I dressed up as hackers, while my wife, quite a fan of the horror genre, went as a character from You're Next.

    Halloween Content from Dell Software
    To top this off, the Halloween fun doesn't stop with our costumes and parties; it also crept into a number of content offerings published around the season.

    Switching Analytics Platforms – A Process Nightmare?

    The Windows group had:

    And Foglight had a great video with a 2015 Foglight Dashboard refresher:

    We would love to hear about your Halloween adventures. Did you enjoy trick-or-treating, parties, or handing out treats to the neighborhood? Leave me a comment. 

    Kris Freedain

    About Kris Freedain

    After spending 15 years in the Support and Support Operations organizations, he moved to a marketing role and is now responsible for the corporate channel messaging for Dell Software.

    View all posts by Kris Freedain | Twitter

  • Statistica

    Thought Leaders in the Mix - Oct. 2015

    Our subject matter experts for Statistica keep busy, staying abreast of current trends with big data and small data, predictive software and real-world analytics solutions. And they frequently comment in industry publications, professional forums and blogs, or produce videos in some cases. Here are a few of their recent articles.
    Dell Big Data Solution Offers Predictions For A Business' Future
    featuring Danny Stout, senior analytics consultant

    In a demonstration for CRNtv, Danny Stout shows the simplicity of using Statistica to reduce customer churn—that is, to predict and reduce the likelihood of customers leaving a company's consumer base—while cutting costs and building revenue. Danny further describes the value of Dell's end-to-end solution that comprises hardware, software, and services.


    Dell’s Acquisition-Driven Analytic Reformation
    featuring John K. Thompson, general manager for advanced analytics

    John Thompson recently described to Datanami how the internal data migration to Statistica from “a legacy software provider in North Carolina” has quickly reduced non-standard KPIs by 50 percent and synchronized 99 percent of incoming information with existing records. Impressive!


    Powering Student Learning with Data Analytics
    by Joanna Schloss, business intelligence and analytics evangelist

    Schools and teachers have long scrambled to keep up with the progression of educational tools. But without adequate support, pushing technology into the classroom becomes a source of frustration for teachers and a disservice to students. Joanna Schloss addresses the recent report Dell published with THE Journal about the transformative effect predictive analytics and data management can have on K-12 education.

    Analytics Migrations: The Spooky Side of Change
    by David Sweenor, analytics product marketing manager

    This Halloween, perhaps the creature you fear most is the 800-pound monster in your I.T. department commonly referred to as “change.” However, when it comes to migrating from one data analytics system to another, Dell is an experienced monster-killer. Dave describes how we reduced the fear of change during our own internal migration, so now the only scary thing is the thought of going back to that legacy system.




  • Information Management

    Outgrowing Your Analytics Platform? Don’t Fear the Right Fit

    I love trick-or-treaters. As soon as dusk hits, our neighborhood gets swarms of cute little kids in adorable costumes, delighting us with their smiles as we drop a few pieces of candy into their pumpkin-shaped pails. As the night goes on, however, the kids get older, more demanding and less creative with their costumes – you get the teenagers wearing pajamas asking for an extra handful and then (sometime around 10PM or later) you get very tall “kids” in street clothes wearing Friday the 13th masks banging on your door, demanding your king-size Snickers bars.

         Photo Credit: Steven Depolo, licensed under CC BY 2.0

    Do I tell them that they may have outgrown the trick-or-treating stage? Nah. I toss a few candy bars through the door and hope they’ll move along. Someone will eventually tell them. It just won’t be me.

    Are you dressing up your analytics tool for non-analytics tasks?

    In my last two posts in this platform migration series, I talked about how change can be a scary thing and how the process involved can seem just as frightening. But trust me, some of the technology challenges we faced during our own migration were also pretty disturbing.

    Much like those king-size “kids,” at Dell we discovered that we had outgrown a well-known legacy analytics platform. We had also just acquired our advanced solution, Statistica – an easier-to-use analytics platform – and needed to be able to tell prospects that we use the product we are promoting. So we embarked on a great migration from a very expensive and dated analytics platform to Statistica.

    During the migration, our team found that a number of analytics users at Dell were using the old product to manage and manipulate data before analyzing it. Sure, it could do the job but it was a really expensive way to move data around. And with all of the legacy code that was required to push data, it was unwieldy and unmanageable. 

    Unlike those grown-up trick-or-treaters, we quickly got out of denial and decided to do something about it.

    How using the wrong tool for the job can haunt you

    Using the wrong tool for the job may not sound like a big deal, especially if it gets the job done all the same. But in our case, it wasn’t just about using a tool with extraordinary analytics capabilities for ordinary jobs — it was also costing us licenses that didn’t need to be tied up on data management and data manipulation tasks. We couldn’t really blame our users for doing what they were likely taught by other Dell users at the time. But, it was an extremely expensive way to perform relatively common functions. And as an enterprise software company, we should know better.

    So how did we overcome our fear of the unknown and commit to something that kind of scared us? We had to separate analytics from data management at the software level and at the organizational level. We moved data management and manipulation to our Toad Data Point solution, and analytics and modeling to Statistica. With each team empowered with the right tools, we now have data integration experts focused on data management and analytics experts focused on analytics. Sure, there were some awkward moments during the transition. But once you figure out what really works for you, life is simply easier ― a lesson those teenage trick-or-treaters will learn soon enough.

    New e-book: The Great Analytics Migration

    If you’re using an expensive analytics software product dressed up in a data preparation costume or you’re facing a migration project of your own, read our e-book Statistica: The Great Analytics Migration. You’ll discover more information on how we successfully moved hundreds of users to a new platform in a matter of months, provided everyone with the right tools for the job and saved a ton in licensing fees. Follow our lead, and you can do it, too.

    David Sweenor

    About David Sweenor

    From requirements to coding, reporting to analytics, I enjoy leading change and challenging the status quo. Over 15 years of experience spanning the analytics spectrum including semiconductor yield characterization, enterprise data warehousing, reporting/analytics, IT program management, as well as product marketing and competitive intelligence. Currently leading Analytics Product Marketing for the Dell Software Group.

    View all posts by David Sweenor | Twitter

  • Information Management

    Lucky 13: New Version of Statistica Ups the Stakes for Predictive Analytics


    Each new release of Statistica builds upon our basic premise: Embedding analytics everywhere is the best route to better decision making. With Statistica 13, which we officially released in September and formally showcased to the world last week on the grand stage of Dell World, we’ve made it even easier to run any analytics on any data anywhere with new tools, deeper integrations and cutting-edge capabilities that push predictive algorithm model-building and scoring directly into the data source.

    Even more powerful analytics

    The latest improvements fit nicely into one of two buckets: general enhancements or new analytical capabilities. In the first bucket, the revamped GUI aligns with the most current Windows products to further our long-standing compliance with Windows standards. As a result, Statistica is more intuitive and easier to use than ever.

    Next, we’ve strengthened integration with the Statistica Interactive Visualization and Dashboard engine so the entire experience of conceiving, authoring and rendering visualizations is done in Statistica. Additionally, our integrated web server capability handles visualization rendering and management, so now analytic output can be easily distributed globally for greater collaboration and information sharing.

    Customer-driven enhancements

    About 90 percent of our new analytical capabilities came directly from recommendations by our world-class user community. Some, like our new Statistica stability and shelf life analysis as well as web data entry, were suggestions from pharmaceutical companies that appreciate our flexibility in addressing their specialized industry requirements.

    Others, such as in-database and in-Hadoop analytics, strengthen our leadership in big data predictive analytics by bringing math to the data. With this enhancement, data consumers can build an analytical model in Statistica, click one button and then deploy in a Hadoop cluster. This is great for organizations that stack a lot of data in a big data environment and want to get at it with predictive analytics.

    Something for everyone

    For data scientists, we’ve added stepwise modeling, lasso regression and tree clustering. And a new ability to mine Chinese text broadens our international scope as Statistica now is available in 12 languages.

    Now not only do we speak more languages, we’ve expanded our community of analysts with in-database processing that enables them to run correlations on full-volume data. What we’ve done is decompose elements of algorithm formulas and simulations so they can be run directly in SQL Server, Oracle and Teradata—really any number of OLE DB or ODBC databases. What this means for our customers is they can better leverage big data investments while enabling people other than data scientists to run correlations. This extends the universe of users dramatically as anyone from a summer intern to a business analyst can run very sophisticated analytics without concerns about complex data management and sampling tasks.
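    To make the idea of decomposing a formula into database-side work concrete, here is a minimal, purely illustrative sketch (not Statistica's actual implementation): a Pearson correlation expressed entirely as SUM/COUNT aggregates that any ODBC-accessible database can compute, so only six summary numbers, rather than the full-volume data, ever leave the database.

```python
import sqlite3

# Illustrative only: push a Pearson correlation into the database as
# plain SQL aggregates, so the full-volume data never leaves the source.
CORR_SQL = """
SELECT COUNT(*)   AS n,
       SUM(x)     AS sx,
       SUM(y)     AS sy,
       SUM(x * x) AS sxx,
       SUM(y * y) AS syy,
       SUM(x * y) AS sxy
FROM measurements
"""

def in_database_correlation(conn):
    # The database does the heavy lifting; only six aggregates come back.
    n, sx, sy, sxx, syy, sxy = conn.execute(CORR_SQL).fetchone()
    num = n * sxy - sx * sy
    den = ((n * sxx - sx * sx) * (n * syy - sy * sy)) ** 0.5
    return num / den

# Stand-in data source (sqlite3 here; the same SQL runs on any SQL database).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE measurements (x REAL, y REAL)")
conn.executemany("INSERT INTO measurements VALUES (?, ?)",
                 [(1, 2.1), (2, 3.9), (3, 6.2), (4, 8.0), (5, 9.8)])
print(round(in_database_correlation(conn), 3))  # 0.999
```

    The table name `measurements` and the sample values are hypothetical; the point is the shape of the technique: translate the math into aggregates, run them where the data lives, and assemble the result client-side.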

    Analyzing data ― even big data ― right where it lives

    Processing data where the data resides, whether that’s a database or a Hadoop cluster, is another important part in Dell’s ongoing evolution to make Statistica accessible to a wider audience. We took another step in that journey in Statistica 13 with new Native Distributed Analytics (NDA) capabilities that integrate with Dell Boomi to transport analytic models anywhere in the world.

    Dell Boomi is Dell’s integration platform that lets organizations connect any combination of cloud and on-premises applications without software or appliances. So, we applied this incredibly cool technology to let users run analytics directly where the data actually lives. It makes perfect sense. Take a model like a neural net and send it across the network at roughly the size of an email with an attachment. This lightweight model then is run against the data in a SQL database and results are returned quickly and efficiently.

    With Boomi, we can take analytics to the data in a highly secure, efficient manner. This capability, which is unique to Dell, is designed to make analytics more accessible and available. That’s also part of our goal in extending the work we started with open source R and Azure ML last year. We’re making strides in collective intelligence to open our platform further and enable people to bring in models from all over the world to solve the toughest business problems.

    How will Statistica 13 simplify your work?

    Statistica 13 deals customers a winning hand with improvements, enhancements and new integrations that illustrate our continuing focus on lean-forward technology. If you’d like to learn more about how Statistica can simplify your work, check out our upcoming webcast, What’s New in Statistica 13.

    What new features in our “lucky 13” release will help your organization do more with your big data predictive analytics? Connect with me on Twitter at @johnkthompson60 to share your thoughts.

  • Information Management

    Powering Student Learning with Data Analytics

    Technology is always knocking on school doors:

    • Blackboards gave way to white boards, which paved the way for smartboards.
    • Calculators took over from slide rules.
    • PCs displaced typewriters for writing term papers.
    • The World Wide Web overtook encyclopedias as the source for research.
    • Smartphones and tablets are squeezing out pens and paper.


    Schools and teachers have long scrambled to keep up with the progression of tools for educating students. The knowledge gap is a given, when you consider that students outnumber teachers dozens-to-one and adopt new technologies quickly.

    Students are taking tests on tablets, studying (and collaborating) with peers online and answering questions in real time on documents in the cloud. Teachers are more accustomed to working with pen and paper, but the facts on the ground are moving them to mobile devices just so they can keep up with students.

    Technology is good. Support is better.

    Students learn new technologies at recess almost daily. How do teachers make time for their own technology training? You can push technology into the classroom, but until there is adequate support, it becomes a source of frustration for the teachers and a disservice to the students.

    Teachers are aware of the gap between the tools and technology on one hand and the support they need to use them effectively on the other. To create the infographic Powering Student Learning with Data Analytics in K-12, THE Journal surveyed decision makers on the use of data in K-12 education and quantified that gap:

    • 67% of teachers lack time to work with data for the sake of student learning.
    • Nearly half of teachers (49%) and a third of staff (33%) lack the skills for data analytics and data management; 21% say they receive no training to change that.
    • Budget is a barrier for 45% of districts and schools. 29% cite a lack of student computing devices as the obstacle.

    Predictive analytics in K-12 education

    Together with THE Journal, we’ve published a report called Game Changer: How Predictive Analytics is Transforming K-12 Education. Read it for more insights into using data analytics and data management in education.

    How do you think we can bridge the technology gap? Let me know in the comments below.

    Joanna Schloss

    About Joanna Schloss

    Joanna Schloss is a subject matter expert in the Dell Center of Excellence specializing in data and information management. Her areas of expertise include big data analytics, business intelligence, business analytics, and data warehousing.

    View all posts by Joanna Schloss

  • Direct2Dell

    Statistica and Medicine: Q&A with Dr. Steven Melemis

    In this post, I share highlights of a fascinating conversation with Dr. Steven M. Melemis, a long-time user of Dell Statistica and recognized expert in the field of addiction medicine. After earning a Ph.D. in statistics, Melemis went back to school for a medical degree and has merged his two academic pursuits to improve patient care and streamline emergency room operations.

    Q: When did you first recognize the data/healthcare connection?

    Given my background in statistics, I saw a long time ago that data drives almost all of our decisions. So if we can better analyze data, we can provide better patient care. The challenge in healthcare, however, is that many physicians suffer from a fear of statistics, especially now when the massive volumes of data can be intimidating.

    Q: So, what’s the best way to help physicians conquer that fear?

    While I enjoy analyzing complex data sets, I realize not everyone wants to dig that deeply. Still, you can learn a lot from simple data visualization. I tell people to look at the charts—they’ll tell you what you need to know. Statistica’s quick-start data mining recipes, analytic workflow templates and out-of-the-box analysis capabilities make it much easier to gain meaningful insights.

    Q: When did you first start using predictive analytics software?

    In my early academic career, I used statistical software that ran on mainframes. During my post-doctoral fellowship in the 1990s, however, I wanted to find something I could use on a PC. That’s when I first found Statistica—and I’ve been using it ever since.

    What I like best about Statistica is it works across the entire spectrum of predictive analytics. While it features sophisticated tools, it’s also suitable for people who just want to draw a few graphs and really look at their data in different ways. When data is more accessible, you can put capabilities in the hands of the people doing the actual work and empower them to see things they didn’t see before.

    Q: How has predictive analytics transformed alcohol withdrawal treatment?

    It has dramatically improved how patients are monitored in emergency rooms and rehab settings. For years, a classic tool called the Clinical Institute Withdrawal Assessment (CIWA) was used to assess whether patients needed treatment for alcohol withdrawal. The labor-intensive CIWA required a nurse to monitor a patient by measuring 10 different variables every hour. The problem: Some people didn’t get the entire test because there simply wasn’t enough time or resources, and occasionally, patients fell through the cracks.

    I knew there was valuable insight buried in the data, so I digitized a small core of CIWA measurements for 100 patients and fed it through Statistica—and one variable stood out! Whether a person needs withdrawal treatment can be predicted accurately 75 percent of the time simply by determining whether the person is sweating. Add one more variable—perceptual disturbances—and the predictive accuracy jumps to 90 percent.
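    A single-symptom predictor like this is just a one-split decision rule, and its accuracy can be measured directly. The sketch below uses entirely synthetic data (not Dr. Melemis's dataset) constructed so the symptom agrees with the outcome for 75 of 100 patients, mirroring the reported figure:

```python
# Illustrative sketch with synthetic data: how well does one binary
# symptom (sweating) predict a binary outcome (needs treatment)?
def rule_accuracy(records):
    """Accuracy of predicting needs_treatment directly from sweating."""
    hits = sum(1 for sweating, needs_treatment in records
               if sweating == needs_treatment)
    return hits / len(records)

# Synthetic cohort of (sweating, needs_treatment) pairs in which the
# symptom matches the outcome for 75 of 100 patients.
cohort = [(1, 1)] * 40 + [(0, 0)] * 35 + [(1, 0)] * 15 + [(0, 1)] * 10
print(rule_accuracy(cohort))  # 0.75
```

    Adding a second variable, as in the article, would turn this into a two-level decision tree; the same accuracy measurement applies.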

    I took that information to the hospitals in Toronto, where I live and work, and also shared it with leading rehab programs. The bottom line: Treating patients for alcohol withdrawal has become faster and more efficient. All from a simple yet powerful data analysis performed by Statistica.

    Q: How has predictive analytics changed your approach to addiction recovery?

    My approach is best summed up in an article I wrote recently for the Yale Journal of Biology and Medicine where I outline relapse prevention and the Five Rules of Recovery. Predictive analytics have shown me that people can greatly improve their chances of recovery if they follow a few simple guidelines.

    By empowering physicians of all kinds to take advantage of valuable data analytics, we can make major strides in personalizing healthcare and treatment plans while lowering healthcare costs and elevating patient care delivery.

    And, the best part is that you don’t need to have a Ph.D. in statistics. All you need is an easy-to-use tool like Statistica to simplify big data analysis and improve critical decision making.

  • Information Management

    Will Dell Statistica Ever Win a Grammy®?


    Subscribers to Statistica's monthly newsletter saw a headline last week mentioning Grammy-winner John Mayer alongside a reference to Statistica 13. Okay, I have to admit, when Dell first announced that Mayer would be at Dell World, I honestly had no idea who he was or why I should care. It turns out that he is a famous guitarist, if you believe everything you read on the Internet (!), and he will be performing at Dell World, the company's annual, global product solutions showcase.

    I don't say any of this to belittle Mayer at all, but merely to highlight for you my own ignorance. Just because I had never heard of Mayer doesn't mean he isn't great at what he does. In the same way, just because organizations around the planet maybe haven't yet heard of every Dell Software solution doesn't mean that we aren't the best at what we do. What it does mean is that, like Mayer, we must keep spreading the word, doing our best to get our name and products in front of wider and wider audiences, attracting the attention of competitors and champions alike.

    Mayer has done well for himself in this regard, and so has Dell Software. In addition to bringing John Mayer to the Dell World stage in October, Dell Software will also officially roll out Statistica 13 at that same event, and we couldn't be more excited! Our Statistica subscribers already read some product hints in the September newsletter (you, too, can subscribe for FREE), but we are keeping the details fairly under wraps until October 20. You may rest assured that, true to Statistica's long legacy, we are working hard to expand our worldwide audience, driving our advanced analytics platform with three defining concepts:

    1. Embed Analytics Everywhere: We want to enable you to run any analytics on any data anywhere to drive better decisions across your organization without breaking the bank.
    2. Empower More People: We improve collaboration between data scientists, business analysts and business users within a single workbench.
    3. Innovate Faster: With Statistica, you can quickly and easily uncover hidden opportunities by analyzing all data, streaming or at rest.

    Will Statistica ever win a Grammy? Maybe not, but Statistica does continue pulling in industry recognition from some very credible sources (for instance, here and here), and we do expect this trend to continue as more and more businesses make the switch to Statistica. It’s like the Grammys, but different.

    Meanwhile, have you registered yet for Dell World or for the Dell World Software User Forum? We want you to be there LIVE for the big Statistica reveal. And maybe you can meet John Mayer, too. I hear he's pretty good.


  • Information Management

    Analytics Migrations – The Spooky Side of Change

    It’s almost Halloween: the one night of the year when everyone becomes something different. As we approach an evening of haunted houses and creepy masks, I can’t help analyzing what it is that really gives us that little shiver of fear. And I think the answer is change. Because when you break it down, Halloween is really an entire holiday built around sudden transformation and the horror that can transpire.  

    Take, for example, your neighbors. You’re used to seeing that khaki-clad soccer dad next door waving hello as he starts his minivan each morning. But come October 31st, when he’s swinging open his front door to greet the now-horrified trick-or-treaters staring up at his Ozzy Osbourne makeup, a fake bat swinging from his mouth, there are bound to be a few screams.

    But it’s possible that if you’d always known that guy, not as a Ned Flanders clone, but simply as the wacky old rock star next door, you wouldn’t even flinch at his proclivity for joker makeup and bat snacks. It’s the epic transformation that really throws you off. Or maybe it’s just seeing a grown man pretend to eat a bat that does that. In any case, change is a frightening thing, and it can send people running for the hills.

    Confronting the 800-pound monster

    So, I have to admit, when I heard Dell wanted to take on a migration project to move hundreds of our employees onto a new advanced analytics platform in just six months, I couldn’t help worrying that our office might turn into its own house of horrors.

    Would the normally reserved analyst in the corner cube leap out screaming that he didn’t want to lose all the functionality he’d spent years building into our legacy system? Would angry mobs take to the halls in protest?

    We needed a way to not only calm everyone’s nerves, but more than that, we needed a way to get everyone excited about something new and different. So how did we do it ― and why?  

    Let’s start with the last part of that question. When you’re using an 800-pound monster of an analytics solution, you start to wonder if there isn’t a better way. From scary-high licensing costs, to the need for so many of our analysts to transform into coders, the prospect of maintaining the status quo began to seem more frightening than overhauling the entire system.

    Meanwhile, we’d just added an advanced analytics product called Statistica to our solutions portfolio. And as we became more and more familiar with its predictive analytics capabilities, we were blown away with what it could do.

    Suddenly, we didn’t just want to sell this thing, we wanted to use it ourselves. The fact was, Statistica was just a more affordable and robust analytics platform than the one that was bleeding us dry.

    And while those of us who’d seen it in action were sold, we still had to sell it to our employees. So we got together and made a plan that would help us properly anticipate and alleviate everyone’s fear of change.

    Making change less frightening

    We first had to prove that all the work users had done with the legacy product wouldn’t be discarded. So we sat down and showed them how everything could be replicated and enhanced in Statistica ― while eliminating the need for coding. Once they saw how they’d be able to leverage the same functions and make them better while doing less work? Well, let’s just say we had everyone’s attention.

    After we’d communicated the change and addressed everyone’s biggest fears, we gave people early access to Statistica. They began to see how this new tool better aligned with their tasks, and they started to shift their emotional connection away from the legacy platform.

    To keep things exciting and build on our momentum, we launched a contest. This inspired teams to get up and running quickly with Statistica, and it was fun to watch everyone build real-world models with their new toolset. It didn’t take long before users were embracing Statistica.

    When all was said and done, we’d successfully migrated hundreds of users onto Statistica in just a matter of months. I guess that says a lot about how cool the product is. But it also says a lot about how people can embrace change quickly, if it’s really for the best.

    And seeing how happy our users are today, I know it was the right thing. I’m not saying I wasn’t scared at first myself. This was a daunting project. But with a little creativity, we took something that could’ve been a nightmare and turned it into something to celebrate.

    Now that I’ve seen just how cool change can be, I’m looking forward to this Halloween. Because the scariest thing I can think of isn’t living next door to a Marilyn Manson impersonator for the night – it’s the thought of going back to our legacy system. Now that I don’t have to worry about that, I’m ready for some fun. 

    New e-book: The Great Analytics Migration

    In our new e-book, Statistica: The Great Analytics Migration, we explain how we created a plan for managing the people, processes and technologies involved in making our analytics platform switch. To learn more about how we migrated hundreds of users to a powerful analytics platform that reduces costs and coding needs, check out our e-book today.

    We hope that our experience will make yours a lot less spooky.

    David Sweenor

    About David Sweenor

    From requirements to coding, reporting to analytics, I enjoy leading change and challenging the status quo. Over 15 years of experience spanning the analytics spectrum including semiconductor yield characterization, enterprise data warehousing, reporting/analytics, IT program management, as well as product marketing and competitive intelligence. Currently leading Analytics Product Marketing for the Dell Software Group.

    View all posts by David Sweenor | Twitter

  • Information Management

    The Changing Face of Data Analytics [New E-book]

    If your company’s data analytics function went to the doctor for a checkup, would it come out with one of these diagnoses?

    • A severe case of hyper-expectations
    • Data scientist deprivation syndrome
    • Technology deficit disorder


    The advanced analytics function is barely the age of a toddler in most organizations, yet the stress is already beginning to show. You can’t fault it, really; analytics itself is changing almost as fast as the data is arriving from your customers, transactions, connected devices, industrial machinery and supply chain.

    It’s like PCs in the early 1980s: We knew we needed and wanted them, but it took a while for us to figure out how to make the best use of them. Our needs changed as fast as the technology changed.

    The changing face of data analytics

    In our new e-book, Break Down the Barriers to Better Analytics, we look at how the changing face of analytics is moving faster than organizations themselves, leading to three symptoms:

    • A severe case of hyper-expectations. What is management asking for, now that the sources and tools for big data are in place? Management wants a data-driven business that understands customers, discovers new opportunities for innovation, makes better decisions and gets everyone in the company using analytics. That’s asking quite a bit.
    • Data scientist deprivation syndrome. In the scramble to hire data scientists who can help your company meet those hyper-expectations of data analytics, numbers are going uncrunched and opportunities are going untapped. Where does the skill set come from? Where are the analytics executives who can apply analytics and make you money? (Hint: Few of them come from IT.)
    • Technology deficit disorder. In several dimensions, your data analytics technology needs to keep pace with the growth in your data — unstructured data, data moving through the cloud and data in motion, to name just a few of those dimensions. The most effective way of closing that technology deficit is to apply analytics where the data resides, instead of moving data to where your analytics reside.
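    That last point, running analytics where the data resides, is easy to picture with a toy example. The sketch below is illustrative only (plain Python with an in-memory SQLite table standing in for a big data source; the table and column names are invented): it contrasts pulling every row to the client with pushing a simple aggregation into the database, which ships back a single result row instead of the whole table.

```python
import sqlite3

# Illustrative only: a tiny in-memory table standing in for a big data source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (sensor_id INTEGER, value REAL)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?)",
    [(i % 3, float(i)) for i in range(9)],
)

# Approach 1: move the data to the analytics -- fetch every row,
# then aggregate on the client side.
rows = conn.execute("SELECT value FROM readings").fetchall()
client_side_mean = sum(v for (v,) in rows) / len(rows)

# Approach 2: push the analytics to the data -- the database aggregates
# and returns one row instead of the whole table.
(in_db_mean,) = conn.execute("SELECT AVg(value) FROM readings".replace("AVg", "AVG")).fetchone()

assert client_side_mean == in_db_mean  # same answer, far less data moved
```

    Both paths compute the same mean; the second transfers one row instead of nine, and that difference is exactly what matters at big data scale and with streaming sources.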

    New e-book: Break Down the Barriers to Better Analytics

    Have a look at our new e-book, Break Down the Barriers to Better Analytics, for more insights into the changing face of analytics; data preparation and data blending; and the corporate- and IT-centered barriers to using analytics efficiently in your organization.

    And be sure to read it before your company’s advanced analytics function goes to the doctor for a checkup.

    David Sweenor


  • Information Management

    New Statistica User Forum Brings On the FUNKY!

    Photo credit: Allesandro

    In its long legacy, Dell Statistica has ranked very favorably for user satisfaction all over the world, as evidenced by survey results (e.g., Rexer) and customer testimonials.

    One way to keep customers happy with Statistica, of course, is to provide great support! We like to share our knowledge and make it easy for customers to engage not only with our internal subject matter experts but also with each other. In fact, our newsletter subscribers were recently provided with a comprehensive list of support tools just waiting for use. If you haven't seen that list before today, you might just want to subscribe for free to our newsletter.

    Keeping all this in mind, what do you suppose the following subjects have in common?

    • Mu tree node
    • stepwise regression
    • group analysis with boosted trees in workspace
    • neural network time series regression
    • left join in Statistica query
    • Statistica mapping capabilities

    …and my personal favorite:

    • Funkction* of regression bands

    If you guessed these are all topics about which Statistica users have asked questions in our new User Discussion Forum, then you guessed right. Our support team monitors the forum to provide answers and feedback, and you can also engage with other registered users who have selected to receive email notifications of forum activity.

    • Q: What Statistica-related topics or suggestions are on your mind?
    • A: Start a thread in the user discussion forum!

    Please note that the only requirement for you to participate in the forum is to register first in the free TechCenter community. This is a different ID/password combo than is used for standard Dell Software Group website access. The community registration page is accessed via the “Join” button in the top right corner of the community header at the forum page. By joining, you gain access not only to the Statistica forums but to the broader Dell TechCenter community, as well. Welcome to Funky Town!*

    Visit the Statistica User Discussion Forum >

    * Yes, that's the delightful way the asker really spelled it in our forum. Apparently, Statistica brings the funky to data analysis and predictive analytics! And now, like me, you can have the catchy Lipps Inc hit Funky Town running through your brain for the rest of the day.

  • Information Management

    The Timing Belt in our Great Analytics Migration [New E-book]

    Photo Credit: Ernesto Andrade | Licensed under CC BY 2.0

    Imagine that you’re in the middle of an analytics migration project (as Dell was).

    Hundreds of users and projects all over the world are in transition from your legacy analytics product to the new one (as ours were).

    Everyone is heads-down-focused on following vast, detailed project plans with a jillion moving parts (as we were).

    Suddenly, out of nowhere, an opportunity to fix Something Else swoops onto the scene (as it did onto ours).

    That Something Else is kind of a mess, and this would be the ideal time to fix it, but everybody around you is urging you to focus, focus, focus on migrating projects and users. The Something Else has to do with the tools and processes on either side of the analytics function. It’s not exactly the same issue as replacing your company’s advanced analytics product, but it’s closely related.

    What do you do: stay focused on your original project or devote some cycles to dealing with the Something Else?

    ETL, data extraction and reporting. And the timing belt.

    At Dell, we were waist-deep in migration from a well-known analytics product we had used for decades (you can probably guess which one) to Statistica, a product we had recently acquired. As I posted a few weeks ago, our migration project team discovered that a lot of people were using a Ferrari to haul dirt – that is, using a powerful analytics tool just for data manipulation – so we made some organizational and tool changes as part of the migration.

    But the Something Else we discovered was that people were using dozens of tools for some of the main functions around analytics:

    • ETL (Extract, Transform, Load) process automation – Microsoft SQL Server Integration Services, Microsoft Visual Studio
    • Data extraction – Microsoft SQL Server, D3’s JavaScript library, Adobe Site Catalyst
    • Reporting – Microsoft SQL Server Reporting Services, Microsoft Access

    We had the opportunity to consolidate or replace these and stop the tool-creep, and it seemed as though we’d never have a better time to do it. Once everyone saw the inefficiency, the whole migration team wanted to deal with it, but it wasn’t part of the original plan.

    It was like the timing belt story:

    “Well, ma’am, your car has 90,000 miles, so we should replace the timing belt. And while we’re in there, we’ll have everything apart, so if there’s a problem with your water pump or the tensioner or the front cover gasket or the seal, that’s the best time to take care of it.”

    It’s tough to bite that bullet and deal with the Something Else. But you know if you don’t deal with it and you have to go back in again later to fix it, you’ll kick yourself.

    Actually, you won’t have to, because your boss will do the kicking for you.

    The Great Analytics Migration – new e-book

    So what did we do at Dell?

    We went the extra mile and did the consolidation. It’s the kind of company we are: we can’t look at an inefficiency and not do something about it. Our companywide Business Intelligence Council ran a survey that found dozens of tools at work. The council identified seven of the top ten additional tools for migration to appropriate Dell technologies. We’ll get to the rest of them eventually.

    Should you migrate users from other tools in the same project? We can’t tell you how to make that decision for your company, but we’ve put together an e-book, “Statistica: The Great Analytics Migration, Part 3: Technology,” that tells you how we made it at Dell. Read the e-book for unique insights into how we managed our migration. We know quite a bit about it.

    Just don’t ask us about your timing belt.


    David Sweenor


  • Information Management

    Hippocrates Would Have Liked PAW-Boston Chowder

    While I certainly appreciate Boston for its history, chowder, and marathon, it is the predictive analytics scene that keeps bringing us back year after year. I know that sounds odd, but the annual Predictive Analytics World (PAW) Boston event is a natural fit for Statistica, especially with the recent development of a predictive healthcare track. 


    Healthcare's connection to predictive analytics arguably extends back to the ancient Greek physician, Hippocrates of Kos, who supposedly provided the instruction, “Declare the past, diagnose the present, foretell the future.” And if that isn't data-scientist-speak, I don't know what is! Hippocrates also touted the medicinal value of food, so I have no doubt he would have prescribed Boston clam chowder for its palliative effects, though I suspect he had his fill of seafood in his time (several hundred years before the birth of Christ).

    Back then, of course, the healthcare system--if it could be called such--was comparatively simple, perhaps limited primarily to individual doctor-patient relationships. That simplicity is no longer the norm. During Statistica's mere 31-year legacy, our customers have driven us to develop expertise that guides healthcare organizations through the necessary components of data management and reporting, patient analytics, insurance risk reduction and regulatory compliance. You can learn about some of our healthcare successes in our datasheets, white papers, and videos.

    So, when it comes to targeted events like PAW-Healthcare in Boston, we get to be all over the place. Our newsletter readers (yes, you can subscribe for FREE) already received a short list of our PAW-Healthcare exposure, where we will be sharing our expertise face-to-face with modern-day physicians and data scientists at breakfasts, meetups, and presentations. Take a look here and then be sure to register for PAW-Healthcare yourself.

    We will also maintain a presence at booth #240, so we hope to see you there the week of September 28.

  • Information Management

    Onboarding Users and our Great Analytics Migration [New E-book]

    You can lead a horse to water, but you can’t make it drink.

    Image credit: Greg Westfall | Licensed under: CC BY 2.0

    If you’re going to spend months putting that water in place by migrating to a new analytics platform, you’d better build a process for onboarding users smoothly so that they drink. Otherwise, you’ll end up with a lot of people looking like the kid in the photo and reciting the caption to you.

    I mentioned in my previous post the migration project we underwent here at Dell to move off one of the world’s best-known legacy analytics products and onto Statistica, an analytics platform Dell had recently acquired. How do you onboard users throughout the project when you make a fundamental switch like that?

    Where does your user onboarding process live?

    Who manages user onboarding in your organization? Usually, the onboarding process lives in IT, which is where it used to reside at Dell. It wasn’t perfect, but we lived with it that way for a long time, along with three burdensome restrictions:

    • Finite number of licenses: It’s hard to onboard new users when you have a limit on licenses for your analytics software. We had to ask IT for more licenses, and they had to tell us none were available.
    • License swapping: De-activating and re-activating licenses to move them between data analysts was a drag, but as keeper of the software keys, IT had to be involved.
    • Doling out licenses carefully: On the rare occasions when licenses were freed up, people had to lobby IT for access to them.

    In fact, it took the migration project to finally break this process, and that’s when we moved it out of IT.

    The flexible licensing model of Statistica allowed us to change the focus of onboarding users from IT to self-service in the business units themselves. Then, we made an organizational change so that the Business Intelligence Center of Excellence in each business unit made and managed its own strategy for allocating access to Statistica.

    That pushed the unwelcome variable (IT’s response time) out of the migration project and kicked off more-efficient onboarding for everyone. Internal customer satisfaction went up when users saw that getting access would be easier in the future than it had been with the legacy product.

    The Great Analytics Migration – new e-book

    We’ve written a new e-book called “Statistica: The Great Analytics Migration, Part 3: Technology.” Read it for an idea of how we at Dell handled the migration from one of the world’s best-known analytics products (you can probably guess which one) onto Statistica while allaying users’ concerns about migration and access.

    If you embark on a migration project, whether for analytics or any other companywide function, you’ll need to think about making the user onboarding process palatable.

    After all, the kid in the photo is cute for a minute or two, but you don’t want your co-workers looking at you like that for months on end.

  • Information Management

    Data Manipulation and our Great Analytics Migration [New E-book]

    “Why would you use a Ferrari to haul a load of dirt?”

    Yeah. Why would you?

         Photo Credit:  Falk Lademann

    You wouldn’t, of course, at least not knowingly. But a few months into the Great Analytics Migration I described last month, our migration team found analytics users at Dell who had been doing the equivalent for years. That’s when the question about using a Ferrari as a dump truck started making the rounds.

    Better-fitting tools for data management and manipulation

    The “Ferrari” was a well-known legacy analytics software product designed to run on mainframes back in the 1970s. (You can probably guess which one it is).

    It happens that the product includes tools for data management and data manipulation, so our users became accustomed to using the Ferrari, with its high licensing fees and remarkable analytics capabilities, for “hauling loads of dirt”; that is, manipulating data before analyzing it.

    In all fairness, most of the users were just doing what they’d learned from other users within Dell. And they weren’t ruining the Ferrari’s transmission or even scratching the paint. But as an enterprise software company, we have a line of products like Toad Data Point that cost less and are perfectly suited to the task of accessing and working with big data sources. And anyway, the entire migration project was about moving off the well-known legacy analytics software product and replacing it with Statistica, an easier–to-use analytics platform that we had acquired.

    So using a Ferrari to haul a load of dirt was costing us licenses that didn’t need to be tied up on data management and data manipulation tasks. It’s an extremely expensive way to perform relatively common functions.

    Better yet, as we separated analytics from data management at the software level, we also separated them at the organizational level. In the course of our migration project, we moved data management and manipulation to Toad Data Point, handled by data integration experts, and analytics and modeling to Statistica, handled by analytics professionals. That put each team in its respective wheelhouse.

    The Great Analytics Migration – new e-book

    Are you by any chance using a well-known legacy analytics software product to manipulate your data? If so, then some of your users are probably using a Ferrari to haul dirt.

    That may be all right with you, but if it isn’t, have a look at “Statistica: The Great Analytics Migration, Part 3: Technology” to find out how we switched analytics platforms worldwide in a matter of months.

  • Information Management

    Coming Out on Top – Toad and SharePlex Take DBTA Readers’ Choice Awards

    Thanks for making Toad and SharePlex #1 again! In the 2015 Readers’ Choice Awards sponsored by Database Trends and Applications Magazine, Toad Development Suite was named Best Database Development Solution, Toad DBA Suite was named Best Database Administration Solution and SharePlex was named Best Streaming Data Solution.

    More than 30,000 data professionals read DBTA, and that distinguished following recognized Dell products as winners in three categories and finalists in nine more of the 29 total categories. Having real-world users award us winner or finalist in almost half of all categories is quite an honor, and we’re gratified that you think so highly of our work.

    Organizations are becoming increasingly data-driven and look upon data as a competitive advantage. It’s no surprise that so many pros rely on Toad to help them meet their SLAs by being more productive, improving performance and ensuring high-quality applications can be delivered faster in their RDBMS and NoSQL environments.

    SharePlex got the nod from DBTA for extending the value of existing systems by integrating with modern systems, making data available to users in real-time and enabling active reporting and fast decision making.

    Dell’s Winners and Finalists in DBTA Readers’ Choice Awards

    Read all about the Readers’ Choice Awards, including judging criteria and category details. Here is a list of the Dell products that were named winners and finalists in this year’s awards:


    Winners:

    • Best Database Administration Solution – Toad DBA Suite
    • Best Database Development Solution – Toad Development Suite
    • Best Streaming Data Solution – SharePlex

    Finalists:

    • Best Database Backup Solution – LiteSpeed for SQL Server
    • Best Change Data Capture Solution – SharePlex
    • Best Data Modeling Solution – Toad Data Modeler
    • Best Database Performance Solution – Spotlight on SQL Server Enterprise
    • Best Business Intelligence Solution – Toad Business Intelligence Suite
    • Best Data Mining Solution – Statistica
    • Best Cloud Integration Solution – Boomi
    • Best Query and Reporting Solution – Toad Data Point
    • Best Data Storage Solution – Compellent

    That’s 11 unique Dell products chosen out of 367 products nominated from dozens of vendors. These awards appear also in the August 2015 print edition of Database Trends and Applications Magazine, which has 20,000 subscribers.

    Download your free trial version

    If you’re not yet using one of the winning products, see what all the fuss is about. Click on the following links for a free trial version:

    Toad DBA Suite for Oracle

    Toad Development Suite for Oracle


    Thanks again for making us number one! We have lots more on the roadmap for all of these products, so keep your eye on us for next year’s awards.

  • Statistica

    Thought Leaders in the Mix - Sept 2015

    Our subject matter experts for Statistica and the Information Management Group (IMG) keep busy, staying abreast of current trends with big data and small data, predictive software and real-world analytics solutions. And they frequently comment in industry publications, professional forums and blogs—or produce videos, in Shawn Rogers' case. Here are a few of their recent articles.
    Automated analytics can fill in for data scientists: But...(!)
    by Dr. Thomas Hill, executive director analytics, Dell Software

    One way to look at how predictive modeling technology will transform the healthcare sector is to compare it to other industries that were the earliest adopters—and automaters—of such methods. In this Health Data Management article, Dr. Hill asks whether healthcare data science and predictive modeling could be similarly automated, and what exactly that would look like.



    Video clips from London's Cloud World Forum 2015
    by Shawn Rogers, chief information officer, Information Management Group

    Speaking recently at Cloud World, Shawn addressed the great business opportunity afforded by hybrid data environments, where the cloud presents an interesting convergence of technologies and capabilities that enable data processing from almost anywhere—often with the purpose of applying advanced analytics that lead to insights not previously understood.



    Thirst for Advanced Analytics Driving Increased Need for Collective Intelligence
    by John K. Thompson, general manager advanced analytics, Dell Software

    In his latest article, John Thompson explains that the data scientist skills gap will not deter data-driven organizations from achieving the benefits of predictive analytics, thanks to their willingness to pursue collective intelligence as a practical, collaborative workaround that is powerful enough to "change the world."



    Three Tips for Surviving Today's Complex Data Landscape
    by John Whittaker, executive director, product marketing, Information Management Group

    In this article contributed to Data Center Knowledge, John acknowledges that many organizations collect a disparate mix of structured and unstructured data, and he spells out three information management priorities for DBAs to maintain efficiency and achieve successful integration with analytics downstream.





  • Direct2Dell

    Opening Doors of Big Data Innovation with IT and Business Alignment

    Deriving value from big data is getting a lot easier, thanks to the continuing breakdown of both technological and economic barriers. Many people would have you believe that big data is a new concept, when, in fact, it has been around a lot longer than most realize. It just used to take a federal grant to do anything with it.

    Now, however, technology has evolved to where it’s possible to analyze data at the speed of business more economically than ever before. This opens doors of innovation to a much broader swath of organizations that can use information to drive their businesses forward—faster, further and more competitively.

    But to cross the finish line, you need to answer the following questions:

    1. How can I transform my business using data, especially information I couldn’t address in the past?
    2. How can I do better, smarter things with data while driving operational efficiencies?
    3. How do I align technical and business people to fully utilize data and take our business to the next level?

    When I first got into this business, it was enough to figure out how many widgets were sold in a particular region. Now, companies want to know how many widgets were sold in a particular region, in a certain color, to a specific customer, 10 minutes ago. Or, even better, they want to be able to predict the result before it happens. This takes a different approach to information—one that requires IT and business people to be in lockstep before opening the data floodgates. As the saying goes, it takes a village.

    The extra effort to align is well worth it, however, as great things can happen when business and IT leaders are on the same page. At Information Laboratory, a leader in the development, manufacturing and distribution of medical devices, getting there meant giving research scientists and engineers ready access to a wealth of production test data. It also meant that both groups needed the ability to perform analyses of manufacturing, quality control and supply chain information to drive better quality and product innovation.

    With Dell Statistica, analysts throughout the company can help themselves without IT intervention. As a result, Information Laboratory has taken advantage of its organizational intelligence to streamline and improve manufacturing operations. They’ve accomplished this by quickly identifying and fixing any problems associated with producing hundreds of thousands of device cartridges containing a card with a variety of measurement sensors. The bottom line: Information Laboratory has saved hundreds of thousands of dollars by avoiding the need to scrap a single batch of sensor cards.

    Without technology, cost and organizational barriers, companies can drive innovation and deliver collective intelligence to those who need it most. This will be key to achieving success in a data-driven Internet of Things (IoT) world.  Despite what some pundits say, IoT is not a new trend, as machines have been pumping out data for a really long time. Companies like Information Laboratory and others in healthcare, manufacturing, and automotive have been working with sensor-generated data for years. What’s new is the level of connectivity between data sources and the availability of cost-effective technologies and analytics tools that enable companies to do more with their data.

    And, the more you can do with data, the better positioned you are to handle the massive scale required to integrate sensor-generated information with other digital data. Companies will demand the agility to execute analytics, and to manage and secure data both at the edge near the sensor and at the core of the IoT/data environment.

    Think about the improvements in healthcare decisions if patients can work more directly with their physicians to fill in information gaps. They can infuse EMR data with self-generated information from their fitness wearables and data gathered at home, such as exercise levels, walking heart rate, recent glucose readings, etc. Doctors can blend that information with vitals data gathered during office visits. Applying industry best practices to the EMR data enables physicians to offer personalized, more effective recommendations designed to improve patient outcomes.
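    As a rough illustration of the kind of data blending described above, the sketch below combines office-visit vitals with averaged home readings. The field names, record layout and alert threshold are hypothetical assumptions for illustration, not a real EMR schema:

```python
# Hypothetical sketch: merge self-reported wearable data with office-visit
# vitals to give a physician a fuller picture. Field names and the flag
# threshold are illustrative assumptions, not a real EMR schema.

def blend_patient_record(emr_vitals, wearable_readings):
    """Combine office-visit vitals with averaged home heart-rate readings."""
    avg_home_hr = sum(r["heart_rate"] for r in wearable_readings) / len(wearable_readings)
    record = dict(emr_vitals)  # copy the office-visit data
    record["avg_home_heart_rate"] = round(avg_home_hr, 1)
    # Flag an unusually large gap between office resting HR and typical home HR
    record["hr_gap_flag"] = abs(record["resting_heart_rate"] - avg_home_hr) > 40
    return record

emr = {"patient_id": "p001", "resting_heart_rate": 72, "bp": "120/80"}
home = [{"heart_rate": 95}, {"heart_rate": 102}, {"heart_rate": 98}]
print(blend_patient_record(emr, home))
```

    The point of the sketch is the direction of flow: patient-generated data enriches the clinical record, giving the physician more context than office vitals alone.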

    While the healthcare industry offers plentiful examples, any company that breaks down barriers can achieve measurable big data benefits. One major advantage at these organizations: No one is afraid of their data. Also, IT isn’t always the center of innovation as key business stakeholders—from sales, marketing, finance and customer support—are funding major projects.

    If you can marry your big data, regardless of where and how it’s generated, with crucial business processes, you’ll win. What’s your strategy for breaking down barriers and opening doors of innovation? Connect with me on Twitter at @shawnrog to share your story.

  • Information Management

    #ThinkChat– Patient Engagement… Is It the Answer or Just Buzz?

    Follow #ThinkChat on Twitter Friday, September 18th, at 11:00 AM PDT, for a live conversation exploring the importance of patient engagement.

    Patient engagement can mean many things. Engaging patients through new technologies can help facilitate better communication, education, and collaboration, resulting in better health outcomes. When there’s an increase in meaningful physician-patient interactions, patients are more likely to get involved in their personal health care. This may be just one step that can eventually lead to patient empowerment. Join the conversation and share your perspective on how patient engagement can lead to better health outcomes.

    Join Shawn Rogers (@ShawnRog), Chief Research Officer for Dell IMG; Dr. Nick van Terheyden (@drnic1), Chief Medical Officer at Dell; Janice Jacobs (@JaniceJacobs44), Social Media Solution Leader, Dell Services; Stephanie Bartels (@Steph_bartels19), Global Solution Leader - Patient Engagement, Dell Services; and Mandi Bishop (@mandiBPro), Health Plan Analytics Innovation Practice Lead, Dell Services, for this month's #ThinkChat as we talk with the community about the impact of patient engagement.

    Join @DellBigData and share your own personal stories about patient engagement and follow #ThinkChat and #NHITweek!

    Questions discussed on this program will include:

    • There is a lot of talk about patient engagement. Is it all talk or are healthcare organizations investing?
    • What are the barriers to engaging people?
    • Patient engagement seems like a buzzword; we hear it a lot. Is it a fad or is it here to stay?
    • What's your point of view on how patient engagement is defined?
    • Is there real ROI from engaging patients?
    • Doesn’t it cost a lot, though, to work with patients 1:1? Do hospitals have the resources to do this?
    • What are the barriers to getting clinicians to do things differently with patients?
    • What can we do differently to be more successful with patient engagement strategies?
    • How can we leverage mobile technology to more effectively engage patients?

    Where: Live on Twitter – Follow Hashtag #ThinkChat to get your questions answered and participate in the conversation!

    When:  September 18th, at 11:00 am PDT

    Shawn Rogers

    About Shawn Rogers

    Shawn Rogers is Chief Research Officer for the Information Management Group at Dell Software. Shawn is an internationally recognized thought leader, speaker, author and instructor on the topics of IoT, big data, analytics, business intelligence, cloud, data integration, data warehousing and social analytics. Shawn has more than 19 years of hands-on IT experience. Prior to joining Dell, he was Vice President of Research for Business Intelligence and Analytics at Enterprise Management Associates, a leading analyst firm. Shawn helps customers apply technology to fuel innovation and create value with data.

    View all posts by Shawn Rogers

  • Direct2Dell

    Postcards from the Edge of IoT Analytics

    Old postcards illustrate how Dell offers solutions for managing, securing and analyzing data from the data center to the farthest endpoint

    Late last month, I participated on a panel at the IoT Evolution Conference & Expo, entitled “Unleashing value from analyzing data generated by the Internet of Things.” Joining me were Syed Hoda, CMO at ParStream, and Laurie Lamberth, associate partner at 151 Advisors. Even though it was the last day of the conference, we had the good fortune to share insights with a standing-room-only crowd eager to learn how real-time analytics could help generate more value from their IoT initiatives.

    It’s crystal clear that IoT can help companies drive significant operational efficiencies and business growth. The trick is figuring out the best way to address the rapidly rising numbers of sensors, embedded systems and connected devices, which are taking data volume and complexity to a whole new level.  

    A recent report from ABI Research estimates the volume of data captured by IoT-connected devices will surpass 1.6 zettabytes within five years. According to ABI, only a fraction of this data is currently being captured for further analysis because the vast majority is stored or processed locally without a way for it to be easily shared across the enterprise to aid decision making.

    Many companies are betting on fog computing to solve this problem by reducing the amount of local data that needs to be transmitted back to the cloud for processing and analysis. Bringing these functions closer to the data source will let companies extend the benefits of cloud computing to their network edge for faster, easier and more meaningful business insights. That’s where edge analytics come in, as the ability to access time-sensitive, geospatial data opens the door for real-time analysis of data with increased accuracy and context.

    Edge analytics will help fulfill the promise of IoT and will be a critical factor in scaling IT infrastructures to reliably capture, store and ensure accessibility to data generated by hundreds of billions and even trillions of devices. The sheer volume and complexity of managing all of this decentralized, localized data can quickly overload traditional environments and analysis tools.

    Most legacy solutions haven’t been designed to ensure low-latency data access for geospatial workloads at the enterprise’s edge. A lack of protocol standards also complicates cross-domain data sharing while alignment challenges between IT and business stakeholders can quickly derail strategy development and implementation.

    That’s why we recommend architecting for analytics, as the success of any deployment will be tied directly to the quality of the insights gleaned. For many Dell customers, this means having the flexibility to deliver predictive analytics at the core while creating a path for performing data aggregation and scoring at the edge.

    For Smart Start, a Grapevine, Tex.-based leader in alcohol monitoring technology that manufactures a line of ignition interlock breath analyzers, Dell devised an edge analytics solution for sending near real-time quality data from its production line. The goal: Increase product quality throughout the company’s supply chain. The solution: A multi-tier automation and data management system that collects data from each of the assembled products, analyzes it using custom algorithms, then aggregates it from multiple manufacturing sites into the cloud so the latest, most accurate details can be presented in reports and visualization tools.
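    The collect-score-aggregate pattern described above can be sketched in a few lines. Everything here (field names, spec limits, the site name) is a hypothetical illustration of the pattern, not Smart Start’s actual system:

```python
# Hypothetical sketch of the edge-to-cloud pattern: score each assembled
# unit locally against spec limits, then compress results into a per-site
# summary so only compact quality stats travel upstream to the cloud.
# All field names and limits are illustrative assumptions.

def score_unit(measurement, lower=0.02, upper=0.08):
    """Edge step: flag a unit whose sensor reading is out of spec."""
    return {"unit_id": measurement["unit_id"],
            "pass": lower <= measurement["reading"] <= upper}

def aggregate_site(site, scored_units):
    """Aggregation step: compress per-unit results into a site summary."""
    passed = sum(1 for u in scored_units if u["pass"])
    return {"site": site,
            "tested": len(scored_units),
            "yield_pct": round(100.0 * passed / len(scored_units), 1)}

units = [{"unit_id": i, "reading": r}
         for i, r in enumerate([0.05, 0.03, 0.11, 0.06])]
summary = aggregate_site("site-1", [score_unit(u) for u in units])
print(summary)  # only this small summary needs to leave the plant
```

    The design point is that per-unit scoring happens at the edge, and only a compact site summary travels upstream, which is what keeps bandwidth and latency manageable as device counts grow.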

    The key to the success of this—and any—deployment is using modular, architecture-agnostic solutions that scale quickly from pilot to production. Of course, bolstering security is critical as an exponential increase in connected devices introduces an exponential increase in security risks. Dell puts security first to ensure our customers don’t end up with the “Internet of Compromised Things.”

    At Dell, we practice what we preach with solutions for managing, securing and analyzing data from the data center to the farthest endpoint and along all the networks and clouds in between. We suggest starting small and building on current technology investments and real-world successes. Luckily, our customers are well positioned to take advantage of Dell’s end-to-end hardware, software and services framework to build secure, extensible, supportable, expandable and configurable IoT solutions today.

    What are you doing to make the Internet of Things real…today? How do you plan to deploy edge analytics and unlock greater value from your data? Connect with me on Twitter at @alertsource to join the conversation.

  • Statistica

    Using Dell Software Support for your Statistica products

    We invite you to take a few minutes to explore how the Dell Software Support Portal and a host of tools and capabilities are easily accessible to help you get the most from your Statistica products and engage with our experts 24x7x365. From one central location, you will find everything you need, including:

    In the event you do need to contact technical support, you can submit a Service Request via the support portal for the quickest and most effective means of connecting with your regional support representative.

    Opening a Service Request online ensures:

    • Faster response times – Once we receive your request through the support portal, we’ll get back to you as quickly as possible with the assistance you need.
    • Streamlined case management – All essential details about your case are submitted online, including attachments such as log files, screenshots, reports, or trace logs.
    • No telephone hold time – We ensure your case is routed to the most qualified engineer without making you wait on hold.

    We are also excited to announce the new Statistica User Discussion Forum, where savvy minds are invited to post content and questions about all things analytics. Our community forum is regularly monitored by Dell experts and peers to provide best practices, seek feedback, and make product suggestions. We look forward to your participation!

    For more information, visit the Dell Software Support Portal!


  • Information Management

    How a Hybrid Solution Addresses the Crisis of Data Speed

    Subscribers to our Statistica Monthly Newsletter already know about the latest EMA/9sight survey results, which found that data-driven businesses are becoming more interested in speed than volume. This shift in focus will change market dynamics, and that’s what the survey’s executive summary is all about.

    When you read the summary report, you will learn some interesting things:

    • Speed Is Driving Competition – Speed of processing response was the most frequently indicated use case by respondents, at nearly 20%.
    • Time to Value with Applications – Over 20% of respondents implemented big data projects using customizable applications from external providers.
    • Low Latency is High Profile – Big data projects increasingly demand low latency, with over 32% described as real-time or near real-time processing of data.
    • Two-time Use Case Champion – For the second year in a row, the top use case for big data initiatives is speed of processing response, at nearly 20% of mentions.

    Clearly, a growing portion of respondents are feeling the need for speed.

    Of course, speed is intrinsically related to volume and data structure. It is because of the growing volume and variety of data, especially with the onset of the Internet of Things, that data collection and preparation now require extra attention so that time-to-value (i.e., speed) can be maintained or improved. It is also true that not every business is ready to roll with 100% all-new infrastructure (hardware + software + sensors + workflows) to handle all this change from Day One. That means most, if not all, businesses are implementing their data-driven strategies in piecemeal fashion, with a mix of old and new technologies plus a wish list for more.

    This is a good place to mention the “Hybrid Data Ecosystem” as a valid means of addressing the speed issue. EMA originally defined the big data Hybrid Data Ecosystem (HDE) several years ago through end-user surveys and interviews with technology thought leaders, implementation specialists, and software vendor experts. Each platform within an HDE supports a particular combination of business requirements along with operational or analytical processing challenges. Rather than advocating a single data store that supports all business and technical requirements at the center of its architecture, the HDE seeks to determine the best platforms for supporting particular sets of requirements and then links those platforms together. In this sense, the HDE makes the most of the messiness of reality and the overlap of various technologies that exist side-by-side in many businesses today.
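    The HDE’s core idea, matching each workload to the platform whose strengths fit its requirements rather than forcing everything through one central store, can be sketched as a simple dispatch table. The platform names and requirement tags below are illustrative, not EMA’s actual taxonomy:

```python
# Hypothetical sketch of the HDE routing idea: instead of one central
# data store, pick the platform whose declared strengths best match a
# workload's requirements. Platform names and tags are illustrative.

PLATFORMS = {
    "streaming_engine": {"low_latency"},
    "hadoop_cluster":   {"high_volume", "batch"},
    "warehouse":        {"structured", "reporting"},
}

def route_workload(requirements):
    """Return the platform covering the most of this workload's needs."""
    return max(PLATFORMS, key=lambda p: len(PLATFORMS[p] & requirements))

print(route_workload({"low_latency"}))              # streaming_engine
print(route_workload({"structured", "reporting"}))  # warehouse
```

    In a real HDE the “linking platforms together” step is the hard part (data movement, governance, and metadata across stores), but the routing decision itself is this simple in spirit.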

    Let’s face it: conversions, migrations, and upgrades don’t happen overnight and usually involve transition periods that may last into perpetuity. Accordingly, the Hybrid Data Ecosystem is constantly refined. This year, for instance, EMA expanded the HDE scope to include the influence and impact of the cloud on big data environments and data consumers.

    You’ve simply got to see the comprehensive infographic that represents the latest HDE model, and read how Dell Software’s big data offerings (including Statistica) map to it. Click the image below to get to the report.

  • Information Management

    #ThinkChat – Real-Time Analytics: Myth or Right Around the Corner?

    Follow #ThinkChat on Twitter Friday, September 4th, at 11:00 AM PDT for a live conversation and discover how your peers are using real-time data and analytics!  

    The state of analytics is evolving fast and while more people within the business are utilizing and relying on the value that analytics presents, new demands for faster insights are stretching our traditional analytic infrastructure. Real-time analytics are an exciting opportunity for many companies. Do you have the architecture and tools you need to match the speed of the business?

    Join Shawn Rogers (@ShawnRog), Chief Research Officer for Dell IMG; Joanna Schloss (@JoSchloss), Dell Software's Analytics Thought Leader; and special guest Dean Abbott (@DeanAbb), Chief Data Scientist at SmarterHQ, for this month's #ThinkChat and talk with the community about how real-time analytics is impacting your business.

    Join us and share your own personal stories about real-time data or analytics!

    The #ThinkChat Agenda Includes:

    • What solutions/tools do you or your organization use for real-time analytics?  What are your favorites?
    • How does real-time affect your day-to-day decision making?
    • Do you see a real need for real-time analytics?
    • What hinders you or your org from achieving real-time decision making with real-time data?
    • Promote your favorite sites/brands - where do you go to meet analytic professionals?
    • Do you have any favorite books or blogs on big data and real-time analytics?
    • What is your opinion on where or how real-time evolves? Where is it going from here?
    • Is your architecture ready to support real-time analytics?
    • Are there specific workloads that are moving you toward real-time analytics?

    Where: Live on Twitter – Follow Hashtag #ThinkChat to get your questions answered and participate in the conversation!

    When:  September 4th, at 11:00 am PDT

  • Information Management

    Drivers Wanted: Statistica User Forum Now Open

       Image credit: Pat Herman

    In the most recent issue of our Statistica Monthly Newsletter (yes, you can subscribe for free), our readers were made aware of a new Statistica user forum in our community pages. The new forum is intended to be a true user-to-user community, with discussion threads driven by the users, of the users, and for the users.

    The good news is you don’t have to be a Statistica guru to participate! However, this forum does provide you the opportunity to share your best practices, seek feedback on vexing challenges, make product suggestions and expound on specific analytics and data topics that interest you. You can promote yourself by linking to relevant blogs and articles you have written in other forums, too, such as LinkedIn groups. This totally new forum will be regularly monitored by Dell experts and peers alike, so you can anticipate that your posts will be addressed even as we build our new community audience from scratch.

    Why is this a big deal? Because the development of the Statistica platform itself is a response to your real-life use cases and the industry trends that affect you. And because you can improve your own knowledge base (and your personal brand) by collaborating with fellow Statistica users. You never know where the next big idea may come from. Here I will happily defer to greater minds than my own:

    • “Alone we can do so little; together we can do so much.”
      ― Helen Keller
    • “You can't develop all the competencies you need fast enough on your own. Furthermore, if you don't collaborate, your ideas will be limited to your own abilities.”
      ― Vishwas Chavan
    • “Many ideas grow better when transplanted into another mind than the one where they sprang up.”
      — Oliver Wendell Holmes
    • “Share your knowledge. It’s a way to achieve immortality.”
      — attributed to the Dalai Lama
    • “The only thing to do with good advice is to pass it on.”
      — Oscar Wilde
    • “All knowledge is connected to all other knowledge. The fun is in making the connections.”
      — Arthur Aufderheide

    The sun never sets on the Statistica empire, because there are over 1 million Statistica users in dozens of countries around the globe, in industry and academia and government. As a Statistica user, you are never alone. So share the forum link with your peers, and we look forward to your participation.

    For more information, subscribe to the Statistica newsletter >