If your company’s data analytics function went to the doctor for a checkup, would it come out with one of these diagnoses?
A severe case of hyper-expectations
Data scientist deprivation syndrome
Technology deficit disorder
The advanced analytics function is barely the age of a toddler in most organizations, yet the stress is already beginning to show. You can’t fault it, really; analytics itself is changing almost as fast as the data is arriving from your customers, transactions, connected devices, industrial machinery and supply chain.
It’s like PCs in the early 1980s: We knew we needed and wanted them, but it took a while for us to figure out how to make the best use of them. Our needs changed as fast as the technology changed.
In our new e-book, Break Down the Barriers to Better Analytics, we look at how the changing face of analytics is moving faster than organizations themselves, leading to the three symptoms above.
Have a look at our new e-book, Break Down the Barriers to Better Analytics, for more insights into the changing face of analytics; data preparation and data blending; and the corporate- and IT-centered barriers to using analytics efficiently in your organization.
And be sure to read it before your company’s advanced analytics function goes to the doctor for a checkup.
About David Sweenor
From requirements to coding, reporting to analytics, I enjoy leading change and challenging the status quo. I have more than 15 years of experience spanning the analytics spectrum, including semiconductor yield characterization, enterprise data warehousing, reporting/analytics, IT program management, and product marketing and competitive intelligence. I currently lead Analytics Product Marketing for the Dell Software Group.
Photo credit www.RGBStock.com Allesandro
Throughout its long history, Dell Statistica has ranked very favorably for user satisfaction all over the world, as evidenced by survey results (e.g., the Rexer surveys) and customer testimonials.
One way to keep customers happy with Statistica, of course, is to provide great support! We like to share our knowledge and make it easy for customers to engage not only with our internal subject matter experts but also with each other. In fact, our newsletter subscribers were recently provided with a comprehensive list of support tools just waiting for use. If you haven't seen that list before today, you might just want to subscribe for free to our newsletter.
Keeping all this in mind, what do you suppose the following subjects have in common?
…and my personal favorite:
If you guessed these are all topics about which Statistica users have asked questions in our new User Discussion Forum, then you guessed right. Our support team monitors the forum to provide answers and feedback, and you can also engage with other registered users who have opted to receive email notifications of forum activity.
Please note that the only requirement for you to participate in the forum is to register first in the free TechCenter community. This is a different ID/password combo than is used for standard Dell Software Group website access. The community registration page is accessed via the “Join” button in the top right corner of the community header at the forum page. By joining, you gain access not only to the Statistica forums but to the broader Dell TechCenter community, as well. Welcome to Funky Town!*
Visit the Statistica User Discussion Forum >
* Yes, that's the delightful way the asker really spelled it in our forum. Apparently, Statistica brings the funky to data analysis and predictive analytics! And now, like me, you can have the catchy Lipps Inc hit Funky Town running through your brain for the rest of the day.
Photo Credit: Ernesto Andrade Licensed under CC BY 2.0
Imagine that you’re in the middle of an analytics migration project (as Dell was).
Hundreds of users and projects all over the world are in transition from your legacy analytics product to the new one (as ours were).
Everyone is heads-down-focused on following vast, detailed project plans with a jillion moving parts (as we were).
Suddenly, out of nowhere, an opportunity to fix Something Else swoops onto the scene (as it did onto ours).
That Something Else is kind of a mess, and this would be the ideal time to fix it, but everybody around you is urging you to focus, focus, focus on migrating projects and users. The Something Else has to do with the tools and processes on either side of the analytics function. It’s not exactly the same issue as replacing your company’s advanced analytics product, but it’s closely related.
What do you do: stay focused on your original project or devote some cycles to dealing with the Something Else?
ETL, data extraction and reporting. And the timing belt.
At Dell, we were waist-deep in migration from a well-known analytics product we had used for decades (you can probably guess which one) to Statistica, a product we had recently acquired. As I posted a few weeks ago, our migration project team discovered that a lot of people were using a Ferrari to haul dirt – that is, using a powerful analytics tool just for data manipulation – so we made some organizational and tool changes as part of the migration.
But the Something Else we discovered was that people were using dozens of tools for some of the main functions around analytics:
ETL (Extract Transform Load) Process Automation – Microsoft SQL Server Integration Services, Microsoft Visual Studio
Reporting – Microsoft SQL Server Reporting Services, Microsoft Access
We had the opportunity to consolidate or replace these and stop the tool-creep, and it seemed as though we’d never have a better time to do it. Once everyone saw the inefficiency, the whole migration team wanted to deal with it, but it wasn’t part of the original plan.
It was like the timing belt story:
“Well, ma’am, your car has 90,000 miles, so we should replace the timing belt. And while we’re in there, we’ll have everything apart, so if there’s a problem with your water pump or the tensioner or the front cover gasket or the seal, that’s the best time to take care of it.”
It’s tough to bite that bullet and deal with the Something Else. But you know if you don’t deal with it and you have to go back in again later to fix it, you’ll kick yourself.
Actually, you won’t have to, because your boss will do the kicking for you.
The Great Analytics Migration – new e-book
So what did we do at Dell?
We went the extra mile and did the consolidation. It’s the kind of company we are: we can’t look at an inefficiency and not do something about it. Our companywide Business Intelligence Council ran a survey that found dozens of tools at work. The council identified seven of the top ten additional tools for migration to appropriate Dell technologies. We’ll get to the rest of them eventually.
Should you migrate users from other tools in the same project? We can’t tell you how to make that decision for your company, but we’ve put together an e-book, “Statistica: The Great Analytics Migration, Part 3: Technology,” that tells you how we made it at Dell. Read the e-book for unique insights into how we managed our migration. We know quite a bit about it.
Just don’t ask us about your timing belt.
While I certainly appreciate Boston for its history, chowder, and marathon, it is the predictive analytics scene that keeps bringing us back year after year. I know that sounds odd, but the annual Predictive Analytics World (PAW) Boston event is a natural fit for Statistica, especially with the recent development of a predictive healthcare track.
Healthcare's connection to predictive analytics arguably extends back to the ancient Greek physician Hippocrates of Kos, who supposedly provided the instruction, “Declare the past, diagnose the present, foretell the future.” And if that isn't data-scientist-speak, I don't know what is! Hippocrates also touted the medicinal value of food, so I have no doubt he would have prescribed Boston clam chowder for its palliative effects, though I suspect he had his fill of seafood in his time (several hundred years before the birth of Christ).
Back then, of course, the healthcare system--if it could be called such--was comparatively simple, perhaps limited primarily to individual doctor-patient relationships. That simplicity is no longer the norm. During Statistica's mere 31-year legacy, our customers have driven us to develop expertise that guides healthcare organizations through the necessary components of data management and reporting, patient analytics, insurance risk reduction and regulatory compliance. You can learn about some of our healthcare successes in our datasheets, white papers, and videos.
So, when it comes to targeted events like PAW-Healthcare in Boston, we get to be all over the place. Our newsletter readers (yes, you can subscribe for FREE) already received a short list of our PAW-Healthcare exposure, where we will be sharing our expertise face-to-face with modern-day physicians and data scientists at breakfasts, meetups, and presentations. Take a look here and then be sure to register for PAW-Healthcare yourself.
We will also maintain a presence at booth #240, so we hope to see you there the week of September 28.
You can lead a horse to water, but you can’t make it drink.
Image credit: Greg Westfall | Licensed under: CC BY 2.0
If you’re going to spend months putting that water in place by migrating to a new analytics platform, you’d better build a process for onboarding users smoothly so that they drink. Otherwise, you’ll end up with a lot of people looking like the kid in the photo and reciting the caption to you.
I mentioned in my previous post the migration project we underwent here at Dell to move off one of the world’s best-known legacy analytics products and onto Statistica, an analytics platform Dell had recently acquired. How do you onboard users throughout the project when you make a fundamental switch like that?
Where does your user onboarding process live?
Who manages user onboarding in your organization? Usually, the onboarding process lives in IT, which is where it used to reside at Dell. It wasn’t perfect, but we lived with it that way for a long time, along with three burdensome restrictions:
Finite number of licenses: It’s hard to onboard new users when you have a limit on licenses for your analytics software. We had to ask IT for more licenses, and they had to tell us none were available.
License swapping: De-activating and re-activating licenses to move them between data analysts was a drag, but as keeper of the software keys, IT had to be involved.
Doling out licenses carefully: On the rare occasions when licenses were freed up, people had to lobby IT for access to them.
In fact, it took the migration project to finally break this process, and that’s when we moved it out of IT.
The flexible licensing model of Statistica allowed us to change the focus of onboarding users from IT to self-service in the business units themselves. Then, we made an organizational change so that the Business Intelligence Center of Excellence in each business unit made and managed its own strategy for allocating access to Statistica.
That pushed the unwelcome variable (IT’s response time) out of the migration project and kicked off more-efficient onboarding for everyone. Internal customer satisfaction went up when users saw that getting access would be easier in the future than it had been with the legacy product.
We’ve written a new e-book called “Statistica: The Great Analytics Migration, Part 3: Technology.” Read it for an idea of how we at Dell handled the migration from one of the world’s best-known analytics products (you can probably guess which one) onto Statistica while allaying users’ concerns about migration and access.
If you embark on a migration project, whether for analytics or any other companywide function, you’ll need to think about making the user onboarding process palatable.
After all, the kid in the photo is cute for a minute or two, but you don’t want your co-workers looking at you like that for months on end.
“Why would you use a Ferrari to haul a load of dirt?”
Yeah. Why would you?
Photo Credit: Falk Lademann
You wouldn’t, of course, at least not knowingly. But a few months into the Great Analytics Migration I described last month, our migration team found analytics users at Dell who had been doing the equivalent for years. That’s when the question about using a Ferrari as a dump truck started making the rounds.
Better-fitting tools for data management and manipulation
The “Ferrari” was a well-known legacy analytics software product designed to run on mainframes back in the 1970s. (You can probably guess which one it is).
It happens that the product includes tools for data management and data manipulation, so our users became accustomed to using the Ferrari, with its high licensing fees and remarkable analytics capabilities, for “hauling loads of dirt”; that is, manipulating data before analyzing it.
In all fairness, most of the users were just doing what they’d learned from other users within Dell. And they weren’t ruining the Ferrari’s transmission or even scratching the paint. But as an enterprise software company, we have a line of products like Toad Data Point that cost less and are perfectly suited to the task of accessing and working with big data sources. And anyway, the entire migration project was about moving off the well-known legacy analytics software product and replacing it with Statistica, an easier-to-use analytics platform that we had acquired.
So using a Ferrari to haul a load of dirt was costing us licenses that didn’t need to be tied up on data management and data manipulation tasks. It’s an extremely expensive way to perform relatively common functions.
Better yet, as we separated analytics from data management at the software level, we also separated them at the organizational level. In the course of our migration project, we moved data management and manipulation to Toad Data Point, handled by data integration experts, and analytics and modeling to Statistica, handled by analytics professionals. That has put each team in its respective wheelhouse.
Are you by any chance using a well-known legacy analytics software product to manipulate your data? If so, then some of your users are probably using a Ferrari to haul dirt.
That may be all right with you, but if it isn’t, have a look at “Statistica: The Great Analytics Migration, Part 3: Technology” to find out how we switched analytics platforms worldwide in a matter of months.
Thanks for making Toad and SharePlex #1 again! In the 2015 Readers’ Choice Awards sponsored by Database Trends and Applications Magazine, Toad Development Suite was named Best Database Development Solution, Toad DBA Suite was named Best Database Administration Solution and SharePlex was named Best Streaming Data Solution.
More than 30,000 data professionals read DBTA, and that distinguished following recognized Dell products as winners in three categories and finalists in nine more of the 29 total categories. Having real-world users award us winner or finalist in almost half of all categories is quite an honor, and we’re gratified that you think so highly of our work.
Organizations are becoming increasingly data-driven and look upon data as a competitive advantage. It’s no surprise that so many pros rely on Toad to help them meet their SLAs by being more productive, improving performance and ensuring high-quality applications can be delivered faster in their RDBMS and NoSQL environments.
SharePlex got the nod from DBTA for extending the value of existing systems by integrating with modern systems, making data available to users in real-time and enabling active reporting and fast decision making.
Read all about the Readers’ Choice Awards, including judging criteria and category details. Here is a list of the Dell products that were named winners and finalists in this year’s awards:
Best Database Administration Solution – Toad DBA Suite
Best Database Development Solution – Toad Development Suite
Best Streaming Data Solution – SharePlex
Best Database Backup Solution – LiteSpeed for SQL Server
Best Change Data Capture Solution – SharePlex
Best Data Modeling Solution – Toad Data Modeler
Best Database Performance Solution – Spotlight on SQL Server Enterprise
Best Business Intelligence Solution – Toad Business Intelligence Suite
Best Data Mining Solution – Statistica
Best Cloud Integration Solution – Boomi
Best Query and Reporting Solution – Toad Data Point
Best Data Storage Solution – Compellent
That’s 11 unique Dell products chosen out of 367 products nominated from dozens of vendors. These awards appear also in the August 2015 print edition of Database Trends and Applications Magazine, which has 20,000 subscribers.
If you’re not yet using one of the winning products, see what all the fuss is about. Click on the following links for a free trial version:
Toad DBA Suite for Oracle
Toad Development Suite for Oracle
Thanks again for making us number one! We have lots more on the roadmap for all of these products, so keep your eye on us for next year’s awards.
One way to look at how predictive modeling technology will transform the healthcare sector is to compare it to other industries that were the earliest adopters—and automaters—of such methods. In this Health Data Management article, Dr. Hill asks whether healthcare data science and predictive modeling could be similarly automated, and what exactly that would look like.
Speaking recently at Cloud World, Shawn addressed the great business opportunity afforded by hybrid data environments, where the cloud presents an interesting convergence of technologies and capabilities that enable data processing from almost anywhere—often with the purpose of applying advanced analytics that lead to insights not previously understood.
In his latest article at ODBMS.org, John Thompson explains that the data scientist skills gap will not deter data-driven organizations from achieving the benefits of predictive analytics, thanks to their willingness to pursue collective intelligence as a practical, collaborative workaround that is powerful enough to "change the world."
In this article contributed to Data Center Knowledge, John acknowledges that many organizations collect a disparate mix of structured and unstructured data, and he spells out three information management priorities for DBAs to maintain efficiency and achieve successful integration with analytics downstream.
Deriving value from big data is getting a lot easier, thanks to the continuing breakdown of both technological and economic barriers. Many people would have you think that big data is a new concept when, in fact, it has been around far longer. It just used to take a federal grant to do anything with it.
Now, however, technology has evolved to where it’s possible to analyze data at the speed of business more economically than ever before. This opens doors of innovation to a much broader swath of organizations that can use information to drive their businesses forward—faster, further and more competitively.
But to cross the finish line, you need to answer the following questions:
When I first got into this business, it was enough to figure out how many widgets were sold in a particular region. Now, companies want to know how many widgets were sold in a particular region, in a certain color, to a specific customer, 10 minutes ago. Or, even better, they want to be able to predict the result before it happens. This takes a different approach to information—one that requires IT and business people to be in lockstep before opening the data floodgates. As the saying goes, it takes a village.
The extra effort to align is well worth it, however, as great things can happen when business and IT leaders are on the same page. At Information Laboratory, a leader in the development, manufacturing and distribution of medical devices, getting there meant giving research scientists and engineers ready access to a wealth of production test data. It also meant that both groups needed the ability to perform analyses of manufacturing, quality control and supply chain information to drive better quality and product innovation.
With Dell Statistica, analysts throughout the company can help themselves without IT intervention. As a result, Information Laboratory has taken advantage of its organizational intelligence to streamline and improve manufacturing operations. They’ve accomplished this by quickly identifying and fixing any problems associated with producing hundreds of thousands of device cartridges containing a card with a variety of measurement sensors. The bottom line: Information Laboratory has saved hundreds of thousands of dollars by avoiding the need to scrap a single batch of sensor cards.
Without technology, cost and organizational barriers, companies can drive innovation and deliver collective intelligence to those who need it most. This will be key to achieving success in a data-driven Internet of Things (IoT) world. Despite what some pundits say, IoT is not a new trend, as machines have been pumping out data for a really long time. Companies like Information Laboratory and others in healthcare, manufacturing, and automotive have been working with sensor-generated data for years. What’s new is the level of connectivity between data sources and the availability of cost-effective technologies and analytics tools that enable companies to do more with their data.
And the more you can do with data, the better positioned you are to handle the massive scale required to integrate sensor-generated information with other digital data. Companies will demand the agility to execute analytics, and to manage and secure data, both at the edge near the sensors and at the core of the IoT/data environment.
Think about the improvements in healthcare decisions if patients can work more directly with their physicians to fill in information gaps. They can infuse EMR data with self-generated information from their fitness wearables and data gathered at home, such as exercise levels, walking heart rate, recent glucose readings, etc. Doctors can blend that information with vitals data gathered during office visits. Applying industry best practices to the EMR data enables physicians to offer personalized, more effective recommendations designed to improve patient outcomes.
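To make that blending concrete, here is a toy sketch of merging patient-generated wearable data with office-visit vitals. Every field name, patient ID, and value below is invented for illustration; a real EMR integration would be far more involved:

```python
# Toy illustration only: the schema, IDs and values are made up,
# not a real EMR format or device API.

wearable = {  # patient-generated data from fitness devices and home readings
    "p001": {"avg_walking_hr": 92, "last_glucose": 118, "daily_steps": 7400},
}

visits = {  # vitals captured during the office visit
    "p001": {"resting_hr": 68, "blood_pressure": "124/80"},
}

def blended_record(patient_id):
    """Merge both sources into a single view a physician could review."""
    record = {"patient": patient_id}
    record.update(wearable.get(patient_id, {}))  # gaps patients can self-report
    record.update(visits.get(patient_id, {}))    # office vitals win on overlap
    return record

print(blended_record("p001"))
```

The point of the sketch is simply that the two sources complement each other: the device fills in what happens between visits, and the visit supplies clinically measured vitals.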
While the healthcare industry offers plentiful examples, any company that breaks down barriers can achieve measurable big data benefits. One major advantage at these organizations: No one is afraid of their data. Also, IT isn’t always the center of innovation as key business stakeholders—from sales, marketing, finance and customer support—are funding major projects.
If you can marry your big data, regardless of where and how it’s generated, with crucial business processes, you’ll win. What’s your strategy for breaking down barriers and opening doors of innovation? Connect with me on Twitter at @shawnrog to share your story.
Follow #ThinkChat on Twitter Friday, September 18th, at 11:00 AM PDT, for a live conversation exploring the importance of patient engagement.
Patient engagement can mean many things. Engaging patients through new technologies can help facilitate better communication, education, and collaboration, resulting in better health outcomes. When there’s an increase in meaningful physician-patient interactions, patients are more likely to get involved in their personal health care. This may be just one step that can eventually lead to patient empowerment. Join the conversation and provide input on your perspective of how patient engagement can lead to better health outcomes.
Join Shawn Rogers (@ShawnRog), Chief Research Officer for Dell IMG; Dr. Nick van Terheyden (@drnic1), Chief Medical Officer at Dell; Janice Jacobs (@JaniceJacobs44), Social Media Solution Leader, Dell Services; Stephanie Bartels (@Steph_bartels19), Global Solution Leader for Patient Engagement, Dell Services; and Mandi Bishop (@mandiBPro), Health Plan Analytics Innovation Practice Lead, Dell Services, for this month's #ThinkChat as we talk with the community about the impact of patient engagement.
Join @DellBigData and share your own personal stories about patient engagement and follow #ThinkChat and #NHITweek!
Questions discussed on this program will include:
Where: Live on Twitter – Follow Hashtag #ThinkChat to get your questions answered and participate in the conversation!
When: September 18th, at 11:00 am PDT
About Shawn Rogers
Shawn Rogers is Chief Research Officer for the Information Management Group at Dell Software. Shawn is an internationally recognized thought leader, speaker, author and instructor on the topics of IoT, big data, analytics, business intelligence, cloud, data integration, data warehousing and social analytics. Shawn has more than 19 years of hands-on IT experience. Prior to joining Dell, he was Vice President of Research for Business Intelligence and Analytics at Enterprise Management Associates, a leading analyst firm. Shawn helps customers apply technology to fuel innovation and create value with data.
Late last month, I participated in a panel entitled “Unleashing value from analyzing data generated by the Internet of Things” at the IoT Evolution Conference & Expo. Joining me were Syed Hoda, CMO at ParStream, and Laurie Lamberth, associate partner at 151 Advisors. Even though it was the last day of the conference, we had the good fortune to share insights with a standing-room-only crowd eager to learn how real-time analytics could help generate more value from their IoT initiatives.
It’s crystal clear that IoT can help companies drive significant operational efficiencies and business growth. The trick is figuring out the best way to address the rapidly rising numbers of sensors, embedded systems and connected devices, which are taking data volume and complexity to a whole new level.
A recent report from ABI Research estimates the volume of data captured by IoT-connected devices will surpass 1.6 zettabytes within five years. According to ABI, only a fraction of this data is currently being captured for further analysis because the vast majority is stored or processed locally without a way for it to be easily shared across the enterprise to aid decision making.
Many companies are betting on fog computing to solve this problem by reducing the amount of local data that needs to be transmitted back to the cloud for processing and analysis. Bringing these functions closer to the data source will let companies extend the benefits of cloud computing to their network edge, enabling faster, easier and more meaningful business insights. That’s where edge analytics comes in, as the ability to access time-sensitive, geospatial data opens the door for real-time analysis of data with increased accuracy and context.
Edge analytics will help fulfill the promise of IoT and be a gating factor for scaling IT infrastructures to reliably capture, store and ensure accessibility to data generated by hundreds of billions and even trillions of devices. The sheer volume and complexity of managing all of this decentralized, localized data can quickly overload traditional environments and analysis tools.
Most legacy solutions haven’t been designed to ensure low-latency data access for geospatial workloads at the enterprise’s edge. A lack of protocol standards also complicates cross-domain data sharing while alignment challenges between IT and business stakeholders can quickly derail strategy development and implementation.
That’s why we recommend architecting for analytics, as the success of any deployment will be tied directly to the quality of the insights gleaned. For many Dell customers, this means having the flexibility to deliver predictive analytics at the core while creating a path for performing data aggregation and scoring at the edge.
For Smart Start, a Grapevine, Tex.-based leader in alcohol monitoring technology that manufactures a line of ignition interlock breath analyzers, Dell devised an edge analytics solution for sending near real-time quality data from its production line. The goal: Increase product quality throughout the company’s supply chain. The solution: A multi-tier automation and data management system that collects data from each of the assembled products, analyzes it using custom algorithms, then aggregates it from multiple manufacturing sites into the cloud so the latest, most accurate details can be presented in reports and visualization tools.
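The multi-tier pattern described above (score each unit locally, then send only compact aggregates up to the cloud tier) can be sketched in a few lines. The function names, the 0.15 error limit, and the field names are illustrative assumptions, not Smart Start's actual algorithms:

```python
import statistics

def score_reading(reading, limit=0.15):
    """Flag a unit whose sensor error exceeds a quality limit.
    The 0.15 limit is an invented placeholder, not a real spec."""
    return "fail" if reading["error"] > limit else "pass"

def aggregate_site(site_id, readings, limit=0.15):
    """Edge-side step: score each unit locally, then summarize the batch
    so only a compact record travels to the cloud tier."""
    scores = [score_reading(r, limit) for r in readings]
    return {
        "site": site_id,
        "units": len(readings),
        "failures": scores.count("fail"),
        "mean_error": round(statistics.mean(r["error"] for r in readings), 4),
    }

if __name__ == "__main__":
    batch = [{"unit": i, "error": e} for i, e in enumerate([0.02, 0.21, 0.05])]
    print(aggregate_site("site-1", batch))
```

In a real deployment the summary record would be pushed to a message queue or cloud endpoint for reporting and visualization rather than printed.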
The key to the success of this—and any—deployment is using modular, architecture-agnostic solutions that scale quickly from pilot to production. Of course, bolstering security is critical as an exponential increase in connected devices introduces an exponential increase in security risks. Dell puts security first to ensure our customers don’t end up with the “Internet of Compromised Things.”
At Dell, we practice what we preach with solutions for managing, securing and analyzing data from the data center to the farthest endpoint and along all the networks and clouds in between. We suggest starting small and building on current technology investments and real-world successes. Luckily, our customers are well positioned to take advantage of Dell’s end-to-end hardware, software and services framework to build secure, extensible, supportable, expandable and configurable IoT solutions today.
What are you doing to make the Internet of Things real…today? How do you plan to deploy edge analytics and unlock greater value from your data? Connect with me on Twitter at @alertsource to join the conversation.
We invite you to take a few minutes to explore the Dell Software Support Portal, where a host of tools and capabilities are easily accessible to help you utilize your Statistica products and engage with our experts 24x7x365. From one central location, you will find everything you need, including:
In the event you do need to contact technical support, you can submit a Service Request via the support portal for the quickest and most effective means of connecting with your regional support representative.
Opening a Service Request online ensures:
We are also excited to announce the new Statistica User Discussion Forum, where savvy minds are invited to post content and questions about all things analytics. Our community forum is regularly monitored by Dell experts and peers to provide best practices, seek feedback, and make product suggestions. We look forward to your participation!
For more information, visit the Dell Software Support Portal!
Subscribers to our Statistica Monthly Newsletter have already been made aware of the latest EMA/9sight survey results that found data-driven businesses are becoming more interested in speed than volume. This shift in focus will change market dynamics, and that’s what the survey’s executive summary is all about.
When you read the summary report, you will learn some interesting things:
Clearly, a growing portion of respondents are feeling the need for speed.
Of course, speed is intrinsically related to volume and data structure. It is because of the growing volume and variety of data—especially with the onset of the Internet of Things—that data collection and preparation now require extra attention so that time-to-value (i.e., speed) can be maintained or improved. It is also true that not every business is ready to roll with 100% all-new infrastructure (hardware + software + sensors + workflows) to handle all this change from Day One, which means that most—if not all—businesses are likely implementing their data-driven strategies in piecemeal fashion, with a mix of old and new technologies plus a wish list for more.
This is a good place to mention the “Hybrid Data Ecosystem” as a valid means of addressing the speed issue. EMA originally defined the big data Hybrid Data Ecosystem (HDE) several years ago through end-user surveys and interviews with technology thought leaders, implementation specialists, and software vendor experts. Each platform within an HDE supports a particular combination of business requirements along with operational or analytical processing challenges. Rather than advocating a single data store that supports all business and technical requirements at the center of its architecture, the HDE seeks to determine the best platforms for supporting particular sets of requirements and then links those platforms together. In this sense, the HDE makes the most of the messiness of reality and the overlap of various technologies that exist side by side in many businesses today.
Let’s face it: conversions, migrations, and upgrades don’t happen overnight and usually involve transition periods that may last into perpetuity. Accordingly, the Hybrid Data Ecosystem is constantly refined. This year, for instance, EMA expanded the HDE scope to include the influence and impact of the cloud on big data environments and data consumers.
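To make the HDE idea concrete, here is a toy sketch in Python (the platform names and capability sets below are invented for illustration only; they are not part of EMA's model): each workload is routed to whichever linked platform best matches its stated requirements, rather than forcing everything through one central data store.

```python
# Toy illustration of the HDE routing idea: several platforms coexist,
# and each workload goes to the one whose capabilities overlap most
# with the workload's requirements. Names and capabilities are invented.
PLATFORMS = {
    "edw": {"structured", "sql", "governance"},
    "hadoop": {"unstructured", "batch", "volume"},
    "streaming": {"speed", "low_latency"},
}

def best_platform(requirements):
    """Return the platform with the largest capability overlap."""
    reqs = set(requirements)
    return max(PLATFORMS, key=lambda p: len(PLATFORMS[p] & reqs))

# A latency-sensitive workload lands on the streaming platform,
# while a governed SQL workload stays on the warehouse.
best_platform({"speed", "low_latency"})
best_platform({"structured", "sql"})
```

The point of the sketch is only the selection step; in a real hybrid ecosystem the "linking" between platforms (data movement, governance, metadata) is where most of the engineering effort goes.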
You’ve simply got to see the comprehensive infographic that represents the latest HDE model, and read how Dell Software’s big data offerings (including Statistica) map to it. Click the image below to get to the report.
Follow #ThinkChat on Twitter Friday, September 4th, at 11:00 AM PDT for a live conversation and discover how your peers are using real-time data and analytics!
The state of analytics is evolving fast, and while more people within the business are using and relying on the value that analytics delivers, new demands for faster insights are stretching traditional analytic infrastructures. Real-time analytics represents an exciting opportunity for many companies. Do you have the architecture and tools you need to match the speed of the business?
Join Shawn Rogers (@ShawnRog), Chief Research Officer for Dell IMG; Joanna Schloss (@JoSchloss), Dell Software's Analytics Thought Leader; and special guest Dean Abbott (@DeanAbb), Chief Data Scientist at SmarterHQ, for this month's #ThinkChat, and talk with the community about how real-time analytics is impacting your business.
Join us and share your own personal stories about real-time data or analytics!
The #ThinkChat Agenda Includes:
When: September 4th, at 11:00 am PDT
Image credit: Pat Herman
In the most recent issue of our Statistica Monthly Newsletter (yes, you can subscribe for free), our readers were made aware of a new Statistica user forum in our community pages. The new forum is intended to be a true user-to-user community, with discussion threads driven by the users, of the users, and for the users.
The good news is you don't have to be a Statistica guru to participate! The forum gives you the opportunity to share your best practices, seek feedback on vexing challenges, make product suggestions and expound on specific analytics and data topics that interest you. You can promote yourself by linking to relevant blogs and articles you have written in other forums, too, such as LinkedIn groups. This brand-new forum will be regularly monitored by Dell experts and peers alike, so you can anticipate that your posts will be addressed even as we build our new community audience from scratch.
Why is this a big deal? Because the development of the Statistica platform itself is a response to your real-life use cases and the industry trends that affect you. And because you can improve your own knowledge base (and your personal brand) by collaborating with fellow Statistica users. You never know where the next big idea may come from. Here I will happily defer to greater minds than my own:
The sun never sets on the Statistica empire, because there are over 1 million Statistica users in dozens of countries around the globe, in industry and academia and government. As a Statistica user, you are never alone. So share the forum link with your peers, and we look forward to your participation.
For more information, subscribe to the Statistica newsletter >
Months of planning complete, hardware and software procured, associates prepped. The sails are set, and the stars are aligned to flip the switch on a major analytics platform migration. That’s what the buildup felt like when Dell was ready to start moving users from our legacy analytics platform to Statistica, an easier-to-use and lower cost solution, as the company’s analytics platform.
In a previous blog post, we covered the lessons learned from process planning. It bears repeating that the time Dell spent working through process-oriented questions positioned the organization to start moving users onto Statistica on schedule. Even so, once the actual migration was in progress, some new and perhaps surprising processes popped up.
What actually happens when a company migrates to a new analytics platform? Find out in the Dell-on-Dell case study. The e-book, “Statistica: The Great Analytics Migration,” is available for download.
During our migration, Dell realized that it was the business leads in the Centers of Excellence (CoEs) who had the best glimpse into progress, knowing how close each user or department was to migrating completely off the legacy analytics platform. The CoEs also had insight into the unexpected roles and tasks users and managers took on along the way. Let's look at three:
Working in two platforms: Perhaps it’s not surprising that a migration of this magnitude would put added strain onto employees taking part in the migration, but there are only so many hours in a workweek. If there is an expectation on teams to add more tasks onto the daily workflow, plan deadlines accordingly.
Double checking, for a while: Because analytics are pervasive at Dell and run mission-critical business applications, Dell ran Statistica and the legacy analytics system concurrently during the migration to ensure the integrity of results and minimize risk. The process was unexpected and time consuming, but it was necessary before the legacy analytics platform could be turned off.
Trying to align individual business groups: You’ve heard of the phrase herding cats. It’s certainly a good comparison to managing a migration in which various groups operate on their own timeline, working toward their own objectives. But success means getting all groups to completion by the overall deadline.
During the migration, we encountered some additional processes necessary to keep things moving. For instance, the team realized it was important to stop and correct inefficiencies, despite the reluctance to take any time away from moving forward. Managers also experimented with different motivation tactics, including contests. To find out more about the actual migration, download the Dell-on-Dell case study, “Statistica: The Great Analytics Migration, Part 2: Process,” which recounts ways to get all teams to stick to the deadline. Would you expect pressure in your organization to come from the business leads or IT?
There’s more to come from Dell on its Statistica migration. In part 3 of the e-book, we’ll cover all aspects related to technology components of the migration project — from architecture to tooling. In the meantime, read part 2 to get more insight into our migration process.
Subscribers to the Statistica Monthly Newsletter already received a heads-up about several events coming over the next few weeks, including the combined Statistica-Toad tech webcast on August 27, called, “The Smart Data Analyst’s Toolset.”
Maybe the Toad name doesn’t mean much yet to longtime Statistica Enterprise users, but it will. That is because Statistica—historically a very robust and IT-friendly analytics platform on its own merits—is now integrated with Toad Data Point and Toad Intelligence Central (TIC), fellow members of Dell Software’s information management portfolio.
What does Dell Toad do for you and why should you care? Well, that’s what the webcast will cover in detail, so let’s just summarize here by saying Toad opens your door to a whole new world of big data interconnectivity upstream of Statistica. Toad Data Point offers self-service data access, integration, and data preparation functionality with relational and non-relational sources. (And who wouldn’t be happy with more tools for handling data preparation, one of the leading sources of posterior pain among data analysts?) Meanwhile, on the server side, TIC seamlessly connects data, users, and files for well-governed collaboration.
Once again, this is Statistica’s way of “playing nice” with your existing IT assets and software components, a practical trait for which our platform has long been hailed by satisfied users worldwide. If you already have Dell Toad in place, now you can call Data Point and TIC directly from within Statistica Enterprise. If you do not yet have Toad in place, you really should be taking a look at Toad now!
Either way, the August 27 webcast will showcase the new Statistica-Toad connection that enables you to provision data sources across platforms like never before in order to produce a single system for data profiling, cleansing, analysis, modeling, deployment, and scoring.
A powerful connection, indeed.
Uttering the word “process” will likely send a shudder down a business or IT pro's spine, in anticipation of the planning, resources, timelines and deadlines that are all part of said process. Despite the resistance, in many cases, and especially when facing a major IT migration, it's the process that ensures all stakeholders are satisfied with the result.
Dell recently migrated its entire legacy analytics platform to Statistica – and we’re hoping our experience will help other companies in their own migrations. In Part I of the Dell-on-Dell e-book about our Statistica deployment, we reviewed the steps we took to get our people – all associates that touch the analytics platform – on board at integral parts of the migration.
The fear of a major migration shouldn’t stop your organization from deploying a better solution. Learn how Dell moved to a new analytics platform in the e-book, “Statistica: The Great Analytics Migration.”
In Part II of the e-book, we address several process-oriented challenges and questions. A few important process lessons learned:
The migration “process” starts before you even realize it – and no one likes surprises: Even if the executive staff or IT decision-makers share plans with all stakeholders as soon as the project is confirmed, anticipate that some savvy stakeholders already suspect a change is afoot. For Dell, our associates expected a change when Dell acquired Statistica. For other companies, a poorly performing analytics platform or a change in executive leadership could signal a migration. Either way, informing those involved sooner rather than later will limit the number of people caught off guard.
Investing time in laying the groundwork is well worth the effort: Before a single action was taken, Dell pooled every available resource from the Statistica team to truly understand the platform and IT requirements. That process alone took a month. But it was valuable time spent planning and aligning expectations. Better-informed stakeholders could more quickly and accurately answer other process questions:
- How long will the migration take?
- How many users really need to migrate?
- How fast can we be up and running?
Centers of Excellence are monumentally important: Creating Centers of Excellence (CoEs) sounds like a process in and of itself, doesn't it? However, it's these groups of stakeholders, organized by business function, that help the organization with a migration overall. CoEs provide valuable insight into how the project can be helped by, or can help, each business function. At Dell, our CoEs identified analytics platform users who should be part of the migration, which helped our team allocate resources appropriately.
Process-oriented planning is tough, and it's a challenge to get all associates on board with the rigor necessary to make a migration successful. But the advantages of process planning far outweigh the perceived time savings of rushing through a migration. In fact, with pre-planning, Dell was able to get hardware online and ready to accept users in a five-week period – a task that would normally take three months!
For more insight into how Dell ticked through the process-oriented questions we faced, download the e-book, “Statistica: The Great Analytics Migration, Part II: Process.” Whether your organization is migrating 10 users or 10,000 users in a 6-week or 6-year project, our answers may help your migration process go smoothly.
Follow #ThinkChat on Twitter August 13th at 11:00 am PDT for a live conversation about how big data innovation is reshaping enterprise security!
Join Shawn Rogers (@ShawnRog), Chief Research Officer for Dell IMG, Joanna Schloss (@JoSchloss) Dell Software’s Analytics Thought Leader and John Whittaker (@Alertsource) Executive Director, Information Management Group at Dell for this month’s #ThinkChat as they discuss how big data innovation and security are creating opportunities to excel and new challenges for the enterprise.
Tweet with us about how big data drives innovation, and how many companies are jumping on the bandwagon to be first to gain competitive advantage using new data types, insightful applications and new data frameworks, all while trying to adapt to new security concerns, risks and best practices.
Join the conversation!
1. Is big data a challenger or enabler to enterprise security? Do today’s new solutions enable us to thrive?
2. Where do privacy and big data meet? Innovation can breed issues that go beyond security. What precautions are critical?
3. Are there best practices for non-relational data sources with regards to security?
4. Are big data solutions like Hadoop enterprise secure? Do they meet the needs of today’s business?
5. Do new big data frameworks create opportunity for better security analytics or create complexity?
6. IoT is exciting but it opens the door for greater security challenges. What are the top best practices?
7. Speed is at the heart of many security analytics scenarios. How does big data create value when speed is critical?
8. What are some Big Data factors that change the way we approach data protection?
9. How have your data protection needs changed over the past 2-3 years?
10. Is there a difference between securing and governing big data versus traditional data?
When: August 13th, at 11:00 am PDT
The role of the data scientist has become a bit of a legend in the analytics industry these past few years. Many of my DBA friends have gained instant career advancement with self-appointed promotions, adding this job title to their LinkedIn profiles and reaping raises and general admiration from their peers. Tom Davenport and D.J. Patil wrote about this topic back in 2012 for the Harvard Business Review, and it caused everyone to take notice of this analytically driven job. The role of the data scientist has evolved over the past three years, and so has the definition.
In this week's #ThinkChat segment, Tom and I discuss how the role of data scientist is changing, where data scientists fit in an organization, and whether or not everyone who has stealthily added the title to their resume owes Tom a cut of their newfound wages!
#ThinkChat Conversation with Tom Davenport Part 7 of 7
To view other segments in the #ThinkChat series click here.
This short video is part of Shawn's #ThinkChat video series with Tom Davenport, professor of management and IT at Babson College. Here the two briefly discuss how the landscape is evolving and how a growing community of advanced analytic users and enablers are fueling change. It’s the age of the analytic amateur and the semi-pro!
John takes a look at the many ways organizations can benefit from a decentralized, collaborative approach to analytics, an approach made realistically possible for more and more companies with the advent of simple, cloud-enabled tools.
Change can be daunting, especially when it involves unfamiliar technology to accomplish daily tasks. So when an entire workforce must migrate to a new software platform after years with legacy code, what kinds of questions do they ask? And how are their fears replaced with curiosity? David Sweenor introduces the first chapter of a three-part e-book describing Dell’s own recent migration.
In his latest article published at www.HealthDataManagement.com, Dr. Hill discusses the technology revolution that will involve predictive analytics in thousands of healthcare applications and workflows, and he shares his perspective regarding the industry's various opportunities, disruptors and hurdles.
In the latest issue of Statistica Monthly News (yes, you can subscribe for free), our readers found a link to a webcast that talks all about Statistica’s new partnership with Microsoft, a relationship that produces some incredible hybrid cloud functionality for data analysis using Azure Machine Learning (ML).
We are talking about a hybrid cloud solution whose powerful functionality completely belies its namesake: azure, a shade of bright blue often likened to that of a cloudless sky. Cloudless? Hardly. The Statistica-Microsoft partnership is all about the cloud!
The fun story in the webcast describes how one website was running an analytics program as an API on Azure. Designed to guess ages and genders of people in photographic images, the site was expecting a few thousand submissions, but it went from zero to 1.2 million hourly visitors within just two days of going live, and up to seven million images per hour. By day six, 50.5 million users had submitted over 380 million photos! Normally, we would hear about sites crashing with such a viral overload. But this site kept humming along even when the action ramped up so dramatically, primarily because Azure scaled dynamically as intended, handling the unforeseen load like a champ.
Think about embedding this kind of cloud access and flexible scalability as a directly callable function inside Statistica—well, that just makes way too much sense, right? But that is what’s happened! Azure ML is really a development environment for creating APIs on Azure, with the intent to enable users to have machine learning in any application, whether that is a web app or a complex workflow driven by Statistica. For instance, you can host your complicated models in the cloud with Azure and run non-sensitive, big data analytics out there—a very practical time saver and money saver. Then you can bring those analyzed results back down to join perhaps more sensitive data and analytics output behind your firewall. You can learn more when you watch our “Cloud Analytics” webcast.
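The hybrid pattern described above, hosting a model in the cloud and calling it from an application behind your firewall, ultimately boils down to an authenticated HTTP request. As a rough sketch (the endpoint URL, API key, and input schema here are illustrative placeholders, not Statistica's or Azure's actual values), a scoring request to a cloud-hosted model service could be assembled like this in Python:

```python
import json
import urllib.request

def build_scoring_request(endpoint_url, api_key, rows):
    """Build an HTTP request for a hypothetical cloud scoring endpoint.

    The payload shape loosely follows the classic Azure ML web service
    convention of a JSON body with an "Inputs" section; treat the exact
    schema as an assumption to be checked against your service's docs.
    """
    body = json.dumps({
        "Inputs": {"input1": {"Values": rows}},
        "GlobalParameters": {},
    }).encode("utf-8")
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + api_key,
    }
    return urllib.request.Request(endpoint_url, data=body, headers=headers)

# Scoring is then a single round trip (network access required):
# with urllib.request.urlopen(request) as resp:
#     scores = json.load(resp)
```

Inside Statistica, this plumbing is wrapped up in the workflow itself; the sketch simply shows why "machine learning in any application" is a realistic promise: any client that can issue an HTTP request can consume a cloud-hosted model.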
You might have heard that the term "big data" is over-hyped. Maybe you have even heard that it has already slid into the “trough of disillusionment” (as far back as early 2013, if you believe everything you read on the Internet). Even if these assessments are true, the fact remains that the term itself is still relevant and apt for many a business person seeking best tips and practices for developing analytics projects.
In marketing-speak, the “big data” slogan has stickiness. But for all its ubiquity—or, perhaps, because of it—the term "big data" remains something of an enigma, a source of curiosity for business leaders and data executives worldwide. That is to say, business people wanting to get into analytics still respond to that term more than others.
To be fair, the longevity of “big data” works in its favor. Currently, people search for “big data” on Google an average of 60.5K times per month, perhaps because the term seems all-encompassing and broadly descriptive, a good place to start asking questions. Meanwhile, more recent phrases—despite their own merits and relevance—are not sought out nearly as often. For instance, “internet of things” currently averages only 40.5K monthly searches, and “predictive analytics” clocks in at 9.9K. And even if you think “cloud analytics” is destined to be the Google rage someday, right now that phrase averages only 390 searches per month. (That’s not a typo: it is 390.)
This popularity is why we still like to use the “big data” moniker when talking about Statistica’s analytics prowess. Did you read our July issue of Statistica Monthly News? (Yes, you can subscribe for free.) In the sidebar list of events, our subscribers have already seen that we are offering a free Tech Webcast on July 30, “Statistica Eats Big Data for Breakfast.” This webcast will be presented by Mark Davis, the founder of Kitenga and now Distinguished Engineer at Dell Software. He will be focusing on the newer big data capabilities within Dell Statistica 12.7 and how those capabilities can benefit businesses in a variety of use cases, perhaps even in your industry.
Register today and spread the word!
Dell’s SAS migration began shortly after we acquired the advanced analytics product Statistica. Within weeks, we had decided to move all of the company’s analytics users from SAS to Statistica. After assessing how the migration would affect employees, the next challenge was to get everyone on board.
Change can be daunting, especially when it involves embracing unfamiliar technology to accomplish daily tasks. Our main strategy for getting employees on board was to replace fear of an unknown product with curiosity about how best to accomplish analytical tasks with it.
Download the e-book, “SAS to Statistica: The Great Dell Migration – Part 1: People”
Understanding the Reactions
You don’t expect any sweeping change to be widely met with open arms and high-fives, so we were certainly prepared to address concerns from the Dell workforce. When the news was announced, most reactions fell into three buckets:
Most users had never heard of Statistica and many of them felt a deep-seated career-attachment to SAS. Once we realized that, we started working on ways to replace their fear of an unknown product with curiosity about Statistica.
Addressing the Concerns
It was our responsibility to arm employees with as much knowledge about Statistica as possible. We began by arranging communication between our employees and our migration leads from Statistica, to show them that their long years of work would not simply be discarded.
The leads examined the techniques and functions our users had worked with in SAS – K-means clustering, polynomial regression, GLM, ARIMA, neural networks, and more – and demonstrated how to replicate and enhance them in Statistica. Nearly all the techniques they had used in SAS were in Statistica, and were easier to implement. In short, they didn’t need to rewrite thousands of lines of code; they simply dragged and dropped icons on the Statistica workspace.
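Statistica users reproduce these techniques by dragging and dropping workspace nodes rather than writing code, but for readers curious what one of them involves under the hood, here is a minimal, purely illustrative K-means sketch in Python (this is a textbook version of the algorithm, not Statistica's or SAS's implementation):

```python
import math
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal K-means: repeatedly assign each point to its nearest
    centroid, then move each centroid to the mean of its cluster."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # start from k random points
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda j: math.dist(p, centroids[j]))
            clusters[nearest].append(p)
        for j, cluster in enumerate(clusters):
            if cluster:  # keep the old centroid if a cluster empties out
                centroids[j] = tuple(sum(x) / len(cluster)
                                     for x in zip(*cluster))
    return centroids

# Two well-separated groups of points settle into two centroids.
centers = kmeans([(0, 0), (0, 1), (10, 10), (10, 11)], k=2)
```

Of course, the point of the migration story is precisely that users did not have to write or port code like this; the equivalent Statistica workspace node encapsulates it.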
While the discussions eased their concerns somewhat, getting full-scale buy-in required more comprehensive onboarding. We’ll take a closer look at those strategies in an upcoming post.
Download the Dell-on-Dell case study, “SAS to Statistica: The Great Dell Migration – Part 1: People,” to learn more about anticipating the reaction among your business users when you undertake an analytics migration.
New data and new insights are giving rise to new data-driven products and contributing to the digital economy. Companies with data-driven insights into markets, buying behavior, product performance, and procedure execution can leverage that insight to build innovative new products based on their data. These new data products can create meaningful revenue opportunities and enhance customer care and overall corporate execution.
In this week’s #ThinkChat segment Tom and I discuss how companies like GE, Monsanto, Google and Facebook are leading the way with data product innovation and how traditional smaller companies can get in on this opportunity to turn their data into new services, products and revenue streams.
#ThinkChat Conversation with Tom Davenport Part 5 of 7
To view other segments in the #ThinkChat series, click here.