Last week at Dell World, Michael Dell and Satya Nadella announced the industry's first integrated system for hybrid cloud. At Dell we believe the future of cloud is hybrid, and for IT organizations and service providers looking to rapidly deploy a cloud solution, we have a fully integrated, modular system that can be customized to meet your needs.
For a few years now, Dell and Microsoft have been working closely to bring the lessons of building and operating one of the largest public clouds to the data center. The goal is to provide an Azure-like experience to enterprise customers and service providers. Last year, we unveiled Cloud Platform System (CPS) Premium, which has revolutionized how customers deploy and operate private clouds at large scale. Over the past year, we heard your feedback and have built a mid-size hybrid cloud on the same principles as CPS Premium, but more modular, with the ability to start small and pay as you grow. The Dell Hybrid Cloud System for Microsoft CPS Standard is the second member of the CPS family of products.
Key features include:
Our goal is to enable you to confidently adopt cloud in your data center with predictable results, with a solution that lowers your adoption risk, streamlines operations and simplifies your supply chain.
Go to dell.com/dhcs for more information, and stay tuned for more blogs on this topic.
This is the final post in a series of User Experience (UX) topics on the Dell Cloud Blog. The first four topics were UX Culture at Dell Cloud Manager, The Benefits of a UI Pattern Library, Docs Day: UX Tested, Engineer Approved, and Best-in-Class User Research and Persona Building. We look forward to sharing more UX strategy with you in the future!
Dell Cloud Manager recently added a customizable catalog feature that allows admin-level customers to upload blueprints and make them easy to deploy by their end users. In the original feature, the user experience (UX) team added support for uploading blueprints through the user interface, in addition to the existing ability to upload through the API. We received great feedback on the catalog and upload capabilities, but one key use case missing from the first release was the ability to track versions. Through continuous UX research, we learned that users could benefit greatly from the ability to maintain and track multiple versions of a single blueprint. For example, an administrator could test a new version before making it publicly available in the catalog. And if the blueprint administrator discovered a problem with a particular version, they could roll back to a previous, more stable version. Adding support for versions became the goal of our next feature release.
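The versioning workflow described above, keeping multiple versions of a blueprint, testing a draft before publishing, and rolling back to a stable version, can be sketched in a few lines of Python. This is a toy model for illustration only; the class and method names are invented and do not reflect the Dell Cloud Manager API.

```python
class BlueprintCatalog:
    """Toy model of a blueprint catalog with version tracking (illustrative only)."""

    def __init__(self):
        self.versions = {}   # blueprint name -> list of version payloads
        self.published = {}  # blueprint name -> index of the publicly visible version

    def upload(self, name, payload):
        """Add a new version as a draft; it is not public until published."""
        self.versions.setdefault(name, []).append(payload)
        return len(self.versions[name]) - 1  # index of the new version

    def publish(self, name, index):
        """Make a tested version the one end users deploy from the catalog."""
        if index >= len(self.versions.get(name, [])):
            raise IndexError("no such version")
        self.published[name] = index

    def rollback(self, name):
        """Revert to the previous version if the current one misbehaves."""
        if self.published.get(name, 0) == 0:
            raise ValueError("no earlier version to roll back to")
        self.published[name] -= 1

    def current(self, name):
        """The version end users currently see."""
        return self.versions[name][self.published[name]]
```

An administrator could `upload` a new draft, verify it, then `publish` it; `rollback` restores the previous stable version without deleting the newer one, so the faulty draft can still be inspected and fixed.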
A Lean Team Effort
At Dell Cloud Manager, we use lean teams to quickly research, design, develop, and test a new feature by fully dedicating a cross-functional team to focus on a measurable goal. The blueprint versioning feature was a lean team effort that included representatives from UX, front-end engineering, back-end engineering, and product management. All of the participants were remote, and all were fully dedicated to minimize distractions. This allowed us to work very quickly and deliver the feature to our customers in record time. The UX team kicked off the collaboration by presenting an initial set of mockups that were reviewed and discussed with the entire team. We considered numerous options when deciding how the feature could work and continuously revisited—and even refined—our primary goal. Once we came to a consensus, all team members worked in parallel, each of us with a common vision for the feature.
Three business days after the initial kick-off meeting, the UX team ran a set of hour-long usability studies. The participants completed core tasks using an interactive prototype, developed in collaboration with front-end engineering. This prototype eventually became our final software implementation. Over the next week, we iterated on the UI design and ultimately “hooked it up” to the back-end engineering work.
The usability studies validated our assumptions and revealed areas where we could improve our design and implementation. For example, we identified a subtle labeling issue: users were tripped up by a dialog button labeled “Edit version.” At that point in the workflow, users had already made their edits and wanted to “Save”, not “Edit.” We also found design and implementation gaps. Users were confused as they created a new version because there was no feedback that the version had been successfully created; the screen refreshed to the initial, default view, and users were left wondering if their changes had been saved. These issues were quickly fixed.
The most significant finding of the usability study related to our capability set. We experimented with the idea of allowing users to edit their blueprints directly in Dell Cloud Manager. However, we realized that the inline editor we provided was competing with the user’s external versioning system. If a user edited within Dell Cloud Manager, their version would not be recorded in their preferred version control system, so we decided to remove this feature. On the surface it might seem like we reduced the capabilities of Dell Cloud Manager, but in reality, it clarified the capability, reduced confusion, and led to a superior user experience overall. By removing the inline editor, there is no longer confusion about where a user should edit files. And, there is no question about where version control is performed and managed. Using Dell Cloud Manager, our users can see their versions and switch between them. Any number of external tools can, and should, be used alongside Dell Cloud Manager to create and manage versions.
Lean Team Impact
The blueprint versioning feature was designed, developed, tested, documented, and deployed in 4.5 weeks. The tight collaboration of UX, engineering, and product management made it possible. The entire team was focused on building the essential components to support the best user experience. From reviewing the initial mockups to iterating on the UI design as a result of the usability study, the UX team was able to take user feedback and keep the lean team focused on the needs of the end user.
The Dell Cloud Manager User Experience Team welcomes your feedback and suggestions! If you’d like to join our research panel and contribute your voice to the development of Dell Cloud Manager, please visit: http://www.enstratius.com/support/usability.
Search the internet for the phrase “DIY hybrid cloud” (do it yourself) and the results range from sponsored links for a multitude of reference architectures and managed service offerings to a few for building a hybrid cloud in your garage, all of which make the idea of DIY hybrid cloud sound easy.
In 2013, Gartner predicted that by 2017 nearly half of all large enterprises would have hybrid cloud deployments. That does indeed seem to be the direction most enterprise customers want to go, but the journey to get there can be complex, difficult and painful. In fact, many analysts put the current level of hybrid cloud adoption below 20 percent.
The business demands that once drove the increased adoption of virtualization are now also driving the move to private, public and hybrid cloud solutions, but many IT operations are trying to meet that demand from a virtualization-up perspective, with legacy limitations on agility, choice and governance.
The increased demand for rapid adoption often pushes IT to deliver to an artificial deadline when they really need to take a step back and ensure that solutions for security, disaster recovery, operational efficiency, performance, scalability, capacity and financial planning are all in place.
The critical needs for successful On-Demand Hybrid Cloud are very different from the virtualization model.
- Cross platform orchestration and operational automation
- Elastic, consumption based model and measurement
- Pay-as-you-go funding
- Self-service provisioning
- Seamless datacenter extensions
- Flexible workload mobility
- Federated identity management
- Hybrid application management across the lifecycle
For businesses to realize the promised efficiency, reduced costs and competitive advantages of cloud technology, the adoption process needs to be easy, seamless and non-disruptive for the IT department to plan, deploy, implement and manage.
IT operations must be viewed as, and truly become, a broker of services, rather than being focused on buying the components of a virtual datacenter infrastructure, providing desktop and server support and administering workloads.
IT must become focused on delivering value through automation of process, quality of service and driving innovation, while also managing on premise and off premise cloud environments.
Enter the Dell Hybrid Cloud System for Microsoft
The Dell Hybrid Cloud System for Microsoft is about operational efficiency, not technical infrastructure. It is a fully integrated system that provides on-demand self-service, resource consumption and scalability on demand, 24/7/365 availability, archiving, backup, recovery, security and automated failover.
Dell and Microsoft minimize risk and break through the barriers of DIY hybrid cloud by providing a well-engineered and fully integrated turn-key system making hybrid cloud much easier to adopt, deploy, implement, and manage over an entire lifecycle.
Together, Dell and Microsoft are providing the vehicle for customers to move, whether rapidly or at their own pace, from a virtual datacenter infrastructure or an existing private or public cloud to a much more agile, controllable and future-proof hybrid cloud.
The Dell Hybrid Cloud System for Microsoft helps customers extend their data center beyond current boundaries without forcing them to “cloud everything” as they strive to meet the demands of their end users and customers.
Before you search the internet for “DIY hybrid cloud” watch this Dell Hybrid Cloud System for Microsoft video.
The video briefly tells the story of the difficulties of designing, building, configuring, implementing and maintaining a “Do It Yourself Hybrid Cloud,” contrasted with the benefits of the Dell Hybrid Cloud System for Microsoft: a simple and fast way to get the best of private and public cloud in one easy-to-deploy, easy-to-manage hybrid cloud solution.
Then read this blog by Glenn Keels, Executive Director, Product Management and Marketing, Dell Engineered Solutions and Cloud, and then visit www.dell.com/DHCS to start your own future-ready journey to hybrid cloud.
When you were in school, did you ever look up at your teacher and think, “There’s someone who knows data analytics”?
Sure, teachers somehow manage to size people and situations up pretty quickly. So maybe you looked at your teacher and thought, “How did she figure out who shot the rubber band into the ceiling tile?” or “How did he know Bill and Marco were chewing gum?”
But you probably didn’t associate your teacher with big data and analytics.
In my last post, I mentioned that education generates mountains of data, but teachers rarely have the analytical tools to work with the variety and volume of that data. That’s changing with the advent of learning analytics, which applies predictive analytics to improve education for all students.
A new report from THE Journal, called Game Changer: How Predictive Analytics is Transforming K-12 Education, highlights three tips from the New American Foundation for successfully integrating big data into the K-12 classroom:
If you’ve been in school recently or have children there, have you thought about the amount of data that a student generates? Assignments, grades, standardized test scores, attendance, health, financials and personal data are just the start. Then there’s the student’s online footprint: websites, passwords, posts, comments and uploaded/downloaded documents.
Who’s protecting all of that information and keeping it private? Big data shapes the way in which students look at privacy. As they generate, capture and interact with data, they begin to recognize the importance of privacy and data security.
The Game Changer report I mentioned includes five suggestions from the National Center for Education Statistics for managing information in student education records:
Breaches of financial, commercial, *** and government data grab the headlines, but there is plenty of big data in education, too. How do you think it’s affecting student privacy? Let me know in the comments below.
Back in March, Cameron Haight posted a blog about APM vendors needing to bring some joy to the market.
As we have been developing the Foglight APM SaaS Edition, we have focused on simplifying the APM end-user experience and finding novel ways of enabling users to discover the information hidden in their APM data. I have heard great feedback about the Investigate portion of the product and how it lets users interact with their raw trace data in a way that is anything but clinical.
For those of you not yet familiar with the Investigate UI, it is designed to give advanced performance engineers access to raw trace data. We set out to tackle a challenging problem with this feature: how to help users visualize and understand the impact of multiple dimensions on application requests. As my co-worker Steve Fox pointed out in a great Velocity session (at about the 26:30 mark in the video), graphs and charts are limited to three dimensions of data. So, when people want to look at more than three dimensions, they have to do things like page back and forth between different views, trying to hold the context of the previous view, or craft their own portal-style view with custom but unlinked two- and three-dimensional charts, inferring the relationships across the charts.
To solve this problem, we created configurable visual filter chains that link and cascade data from one graph to another. In other words, I can create and show multiple side-by-side charts, and by selecting a set of data in one chart based on some dimension, I can pass just that selected set of requests to the rest of the charts in the filter chain.
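The cascading behavior described above can be sketched as a chain of filters over a shared set of request records: a selection made in one chart becomes the input for every chart downstream. This is a simplified illustration, not the Foglight implementation; the record fields and function names are invented for the example.

```python
from collections import Counter

# Sample request records with several dimensions (field names are invented).
requests = [
    {"status": "bad", "location": "us-east", "client": "mobile",  "type": "/checkout"},
    {"status": "bad", "location": "eu-west", "client": "browser", "type": "/checkout"},
    {"status": "ok",  "location": "us-east", "client": "browser", "type": "/search"},
    {"status": "bad", "location": "ap-south", "client": "mobile", "type": "/checkout"},
]

def select(records, dimension, value):
    """A selection in one chart: keep only the matching records."""
    return [r for r in records if r[dimension] == value]

def chart(records, dimension):
    """What a downstream chart displays: counts per value of one dimension."""
    return Counter(r[dimension] for r in records)

# Selecting the "bad" requests in the first chart cascades that subset
# to every chart downstream in the filter chain.
bad = select(requests, "status", "bad")
print(chart(bad, "location"))  # spread across locations: not a geographical problem
print(chart(bad, "client"))    # both browser and mobile: not a mobile problem
print(chart(bad, "type"))      # all one request type: the likely culprit
```

Each downstream chart recomputes from the filtered subset rather than the full data set, which is what lets the user reason across more than three dimensions without losing context.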
In the screen shot below, you can see an example of the default template, which contains 4 different charts:
Now, look at the screenshot below. Notice that when I selected requests in the first chart, it limited and passed the 26 selected requests to the rest of the charts, which updated to reflect how the different dimensions interact. In this case, I can see that the bad requests are spread across different locations (not a geographical problem) and distributed among browser and mobile browser hits (not a mobile problem), but that they are all related to a single request type.
Of course, we provide access to the individual traces so the developers can isolate the specific problem in the code, but I’ll discuss that UI another day.
Out of the box, there are 16 different templates to get you started, but you can also create your own custom templates using 9 different chart/graph types spanning the 60+ properties associated with request data.
Come try it with your own data
About Rick S
Rick is a Senior Product Manager working on APM in Foglight.
In a demonstration for CRNtv, Danny Stout shows the simplicity of using Statistica to reduce customer churn—that is, to predict and reduce the likelihood of customers leaving a company—while cutting costs and building revenue. Danny further describes the value of Dell's end-to-end solution comprising hardware, software, and services.
John Thompson recently described to Datanami how the internal data migration to Statistica from “a legacy software provider in North Carolina” has quickly reduced non-standard KPIs by 50 percent and synchronized 99 percent of incoming information with existing records. Impressive!
Schools and teachers have long scrambled to keep up with the progression of educational tools. But without adequate support, pushing technology into the classroom becomes a source of frustration for teachers and a disservice to students. Joanna Schloss addresses the recent report Dell published with THE Journal about the transformative effect predictive analytics and data management can have on K-12 education.
This Halloween, perhaps the creature you fear most is the 800-pound monster in your I.T. department commonly referred to as “change.” However, when it comes to migrating from one data analytics system to another, Dell is an experienced monster-killer. Dave describes how we reduced the fear of change during our own internal migration, so now the only scary thing is the thought of going back to that legacy system.
It’s 4:57 p.m. on Friday and you’re counting the seconds until you can go home. And that’s when you get the dreaded call from your boss.
“Hi, Peter. Bill Lumbergh here.”
“You remember that report you sent me two weeks ago?” he asks. “Yeah, I’m gonna need you to resend it now, but update it with fresh data. I’ve got a flight to Chicago this weekend, and I want to read it on the plane.”
Now it’s 4:59, and you’ve got a lot more than a minute of work to do. Unless you have the kind of business analyst toolset that can run reports for you, automatically.
That’s the sound of angels singing
With report automation, you can impress your boss and save your sanity. That’s why it’s an absolutely critical component of any data analytics solution. It lets you schedule and generate reports in just a few clicks.
Oh, go on!
All right. A good analytics tool will make it fast and easy to automate your whole data preparation workflow. It’ll grab all the data requested and automate every task, from data cleansing to data profiling to provisioning datasets and reports.
Then, it’ll output it to any format you want, whether that’s a report, a spreadsheet or an HTML page. And you can do all of this proactively, so you look like you’re working harder than ever ― when you’re really just outsmarting the system.
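The automated workflow described above, grab the data, run the preparation steps, and write the result out in the desired format, can be sketched as a small pipeline. This is a generic illustration rather than any specific product's API; the field names are invented, and the actual 6 a.m. Monday schedule would come from cron or the tool's built-in scheduler.

```python
import csv
import io

def prepare(rows):
    """Data preparation: cleanse (drop incomplete rows) and normalize types."""
    clean = [r for r in rows if r.get("region") and r.get("sales") is not None]
    for r in clean:
        r["sales"] = float(r["sales"])
    return clean

def to_csv(rows):
    """Output step: render the prepared dataset as a CSV report."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["region", "sales"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

def run_report(raw_rows):
    """The whole workflow, end to end, ready to be scheduled."""
    return to_csv(prepare(raw_rows))

# A cron entry like "0 6 * * 1 python run_report.py" would run the whole
# pipeline every Monday at 6 a.m., with no one at a keyboard.
```

Swapping `to_csv` for an HTML or spreadsheet writer changes the output format without touching the preparation steps, which is the point of keeping the pipeline stages separate.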
Employee of the decade
So you know that report he’s always asking for first thing Monday morning, when you’re busy doing something else? Now you can set it to run and send by 6 a.m. each Monday, so he thinks you’ve been up since 4 working on it. Do the same for your Friday evening report, and everyone will leave the office happy and on time.
New on-demand webcast: SQL for the Business Analyst
See how to take full advantage of report automation when you watch our on-demand webcast, SQL for the Business Analyst.
Image credit: Kristy Hom | Licensed under: CC BY 2.0
Protecting your IT environment today is extremely complex and can be a very scary task. You’re haunted by increased security threats, malicious attacks, BYOD, the Internet of Things (IoT) and new network-connected devices that you don’t even know about.
Consider the number of operating systems you are now slated to secure, and the number of BYO devices that are a normal part of your organization's operation: smartphones, tablets and even network-connected devices such as printers, scanners and kiosks. The freedom offered by mobile devices and the BYOD trend opens your organization to a myriad of security risks. Your users want mobility and the flexibility it provides, but you have to balance that against your organization's need for security and control. Meanwhile, security threats continue to grow in both number and sophistication. If you're the person in charge of ensuring your IT network and systems are buttoned up against malicious intruders and a growing world of creatively uncovered and exploited vulnerabilities, your job could be on the line with a single network security breach.
Internet of Things
It’s also clear that the IoT is here to stay and will grow exponentially as more smart devices enter both our personal and business lives. New systems and applications are easier than ever to set up and maintain, which often results in users setting these up on their own – leaving you with applications and systems you can’t protect. Unfortunately, many users are unaccustomed to thinking about issues like security and backups, or they are simply willing to sacrifice security for expediency.
Despite these security threats, protecting your IT environment doesn’t have to be a scary undertaking if you follow these readily available security safeguards:
In addition to the steps outlined here, we invite you to watch an on-demand webcast, Protecting Your Network and Endpoints with the SANS 20 Critical Security Controls, addressing the challenges of protecting your IT environment. In this webcast, presented by internationally recognized security expert Robert Franklin Smith, you’ll be introduced to a practical and straightforward framework that provides 20 actionable security controls with specific recommendations on how to implement them at a technical level. The webcast will briefly introduce you to the entire list, but will focus on seven controls that relate specifically to endpoint security.
Watch the Security Webcast
About David Manks
David Manks is a Solutions Marketing Director for Dell Software focusing on endpoint management and security products.
Photo Credit: Flickr.com Licensed under CC BY 2.0
“I live to break down barriers.”
“I embrace obstacles and I love overcoming them.”
“Life without adversity is boring.”
Do you have people around you who say things like that? They’re kind of exhausting, aren’t they?
One person’s barrier is another person’s challenge, I suppose, but most of us would rather take our analytics plain, thanks.
In my previous post about our new e-book, Break Down the Barriers to Better Analytics, I described the corporate obstacles to getting the most out of analytics. In this post I’ll cover more of what gets in the way of successfully deploying analytics, this time describing the IT obstacles.
Corporate barriers apply to nearly everyone in the company, but there are IT-specific barriers too:
When you overcome the barriers in your IT group, you can speed your time to market with advanced analytics and respond quickly and accurately. Our customer Danske Bank Group overcame both internal IT barriers and external competitive disadvantages by turning new analytical models around faster.
Have a look at our new e-book, Break Down the Barriers to Better Analytics, for more insights into the changing face of analytics; data preparation and data blending; and the corporate- and IT-centered barriers to using analytics efficiently in your organization.
You’ll find plenty in the e-book to help you break down the barriers and get on to better analytics.
About David Sweenor
From requirements to coding, reporting to analytics, I enjoy leading change and challenging the status quo. Over 15 years of experience spanning the analytics spectrum including semiconductor yield characterization, enterprise data warehousing, reporting/analytics, IT program management, as well as product marketing and competitive intelligence. Currently leading Analytics Product Marketing for the Dell Software Group.
I love trick-or-treaters. As soon as dusk hits, our neighborhood gets swarms of cute little kids in adorable costumes, delighting us with their smiles as we drop a few pieces of candy into their pumpkin-shaped pails. As the night goes on, however, the kids get older, more demanding and less creative with their costumes – you get the teenagers wearing pajamas asking for an extra handful, and then (sometime around 10 p.m. or later) you get very tall “kids” in street clothes wearing Friday the 13th masks banging on your door, demanding your king-size Snickers bars.
Photo Credit: Steven Depolo Licensed under CC BY 2.0
Do I tell them that they may have outgrown the trick-or-treating stage? Nah. I toss a few candy bars through the door and hope they’ll move along. Someone will eventually tell them. It just won’t be me.
Are you dressing up your analytics tool for non-analytics tasks?
In my last two posts in this platform migration series, I talked about how change can be a scary thing and how the process involved can seem just as frightening. But trust me, some of the technology challenges we faced during our own migration were also pretty disturbing.
Much like those king-size “kids,” at Dell we discovered that we had outgrown a well-known legacy analytics platform. We had also just acquired our advanced solution, Statistica – an easier-to-use analytics platform – and needed to be able to tell prospects that we use the product we are promoting. So we embarked on a great migration from a very expensive and dated analytics platform to Statistica.
During the migration, our team found that a number of analytics users at Dell were using the old product to manage and manipulate data before analyzing it. Sure, it could do the job but it was a really expensive way to move data around. And with all of the legacy code that was required to push data, it was unwieldy and unmanageable.
Unlike those grown-up trick-or-treaters, we quickly got out of denial and decided to do something about it.
How using the wrong tool for the job can haunt you
Using the wrong tool for the job may not sound like a big deal, especially if it gets the job done all the same. But in our case, it wasn’t just about using a tool with extraordinary analytics capabilities for ordinary jobs — it was also costing us licenses that didn’t need to be tied up on data management and data manipulation tasks. We couldn’t really blame our users for doing what they were likely taught by other Dell users at the time. But, it was an extremely expensive way to perform relatively common functions. And as an enterprise software company, we should know better.
So how did we overcome our fear of the unknown and commit to something that kind of scared us? We had to separate analytics from data management at the software level and at the organizational level. We moved data management and manipulation to our Toad Data Point solution, and analytics and modeling to Statistica. With each team empowered with the right tools, we now have data integration experts focused on data management and analytics experts focused on analytics. Sure, there were some awkward moments during the transition. But once you figure out what really works for you, life is simply easier ― a lesson those teenage trick-or-treaters will learn soon enough.
New e-book: The Great Analytics Migration
If you’re using an expensive analytics software product dressed up in a data preparation costume or you’re facing a migration project of your own, read our e-book Statistica: The Great Analytics Migration. You’ll discover more information on how we successfully moved hundreds of users to a new platform in a matter of months, provided everyone with the right tools for the job and saved a ton in licensing fees. Follow our lead, and you can do it, too.