Dell Community

Blog Group Posts

Blog | Group | Posts
Application Performance Monitoring Blog | Foglight APM | 105
Blueprint for HPC - Blog | Blueprint for High Performance Computing | 0
Custom Solutions Engineering Blog | Custom Solutions Engineering | 8
Data Security | Data Security | 8
Dell Big Data - Blog | Dell Big Data | 68
Dell Cloud Blog | Cloud | 42
Dell Cloud OpenStack Solutions - Blog | Dell Cloud OpenStack Solutions | 0
Dell Lifecycle Controller Integration for SCVMM - Blog | Dell Lifecycle Controller Integration for SCVMM | 0
Dell Premier - Blog | Dell Premier | 3
Dell TechCenter | TechCenter | 1,858
Desktop Authority | Desktop Authority | 25
Featured Content - Blog | Featured Content | 0
Foglight for Databases | Foglight for Databases | 35
Foglight for Virtualization and Storage Management | Virtualization Infrastructure Management | 256
General HPC | High Performance Computing | 227
High Performance Computing - Blog | High Performance Computing | 35
Hotfixes | vWorkspace | 66
HPC Community Blogs | High Performance Computing | 27
HPC GPU Computing | High Performance Computing | 18
HPC Power and Cooling | High Performance Computing | 4
HPC Storage and File Systems | High Performance Computing | 21
Information Management (Welcome to the Dell Software Information Management blog! Our top experts discuss big data, predictive analytics, database management, data replication, and more.) | Information Management | 229
KACE Blog | KACE | 143
Life Sciences | High Performance Computing | 9
OMIMSSC - Blogs | OMIMSSC | 0
On Demand Services | Dell On-Demand | 3
Open Networking: The Whale that swallowed SDN | TechCenter | 0
Product Releases | vWorkspace | 13
Security - Blog | Security | 3
SharePoint for All | SharePoint for All | 388
Statistica | Statistica | 24
Systems Developed by and for Developers | Dell Big Data | 1
TechCenter News | TechCenter Extras | 47
The NFV Cloud Community Blog | The NFV Cloud Community | 0
Thought Leadership | Service Provider Solutions | 0
vWorkspace - Blog | vWorkspace | 511
Windows 10 IoT Enterprise (WIE10) - Blog (Wyse Thin Clients running Windows 10 IoT Enterprise) | Windows 10 IoT Enterprise (WIE10) | 4
Latest Blog Posts
  • Dell Cloud Blog

    Announcing the Dell Hybrid Cloud System for Microsoft

    Last week at Dell World, Michael Dell and Satya Nadella announced the industry's first integrated system for hybrid cloud. At Dell, we believe the future of cloud is hybrid, and for IT organizations and service providers looking to rapidly deploy a cloud solution, we have a fully integrated, modular system that can be customized to meet your needs.

    For a few years now, Dell and Microsoft have been working closely to bring the lessons of building and operating one of the largest public clouds to the data center. The goal is to provide an Azure-like experience to enterprise customers and service providers. Last year, we unveiled Cloud Platform System (CPS) Premium, which has revolutionized how customers deploy and operate private clouds at large scale. Over the past year, we heard your feedback and built a mid-size hybrid cloud on the same principles as CPS Premium, but more modular, with the ability to start small and pay as you grow. The Dell Hybrid Cloud System for Microsoft CPS Standard is the second member of the Cloud Platform System (CPS) family of products.

    Key features include:

    1. Fully integrated cloud stack with System Center and Windows Azure Pack
    2. Integrated multi-cloud management and Azure IaaS with Dell Cloud Manager
    3. Discretely scalable compute and storage blocks
    4. Non-disruptive, sequenced patch and update process, with Microsoft and Dell updates tested, validated, and packaged quarterly
    5. Integrated hybrid cloud support for Azure Backup, Azure Site Recovery, and Operational Insights
    6. Dell Financial Services options to convert CAPEX to OPEX, with consumption-based models that lower risk

    Our goal is to enable you to adopt cloud in your data center confidently, with predictable results and a solution that lowers adoption risk, streamlines operations, and simplifies the supply chain.

    Go to dell.com/dhcs for more information, and stay tuned for more blogs on this topic.

  • Dell Cloud Blog

    UX Case Study: Blueprint Versioning


    This is the final post in a series of User Experience (UX) topics on the Dell Cloud Blog. The first four topics were UX Culture at Dell Cloud Manager, The Benefits of a UI Pattern Library, Docs Day: UX Tested, Engineer Approved, and Best-in-Class User Research and Persona Building. We look forward to sharing more UX strategy with you in the future!


    Dell Cloud Manager recently added a customizable catalog feature that allows admin-level customers to upload blueprints and make them easy for their end users to deploy. In the original feature, the user experience (UX) team added support for uploading blueprints through the user interface, in addition to the existing ability to upload through the API. We received great feedback on the catalog and upload capabilities, but one key use case missing from the first release was the ability to track versions. Through continuous UX research, we learned that users could benefit greatly from the ability to maintain and track multiple versions of a single blueprint. For example, an administrator could test a new version before making it publicly available in the catalog. Also, if the blueprint administrator discovered a problem with a particular version, they could roll back to a previous, more stable version. This missing version support became the goal of our next feature release.
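
    To make that workflow concrete, here is a minimal sketch of what a versioned upload and rollback could look like against a REST API. The endpoint paths, field names, and token handling are hypothetical illustrations, not Dell Cloud Manager's documented API.

    ```python
    import requests

    # Hypothetical endpoints and fields -- not Dell Cloud Manager's
    # documented API; this only illustrates the upload/rollback flow.
    BASE_URL = "https://dcm.example.com/api/v1"
    HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}

    def upload_blueprint_version(catalog_id, name, blueprint_yaml, publish=False):
        """Upload a new blueprint version, kept private until published."""
        resp = requests.post(
            f"{BASE_URL}/catalogs/{catalog_id}/blueprints/{name}/versions",
            headers=HEADERS,
            json={"content": blueprint_yaml, "published": publish},
        )
        resp.raise_for_status()
        return resp.json()["version"]

    def rollback(catalog_id, name, version):
        """Point the catalog entry back at a previous, stable version."""
        resp = requests.put(
            f"{BASE_URL}/catalogs/{catalog_id}/blueprints/{name}/active-version",
            headers=HEADERS,
            json={"version": version},
        )
        resp.raise_for_status()

    # Test a new version privately, then roll back if it misbehaves.
    v = upload_blueprint_version("team-a", "web-tier", "resources: {}")
    rollback("team-a", "web-tier", v - 1)
    ```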

    A Lean Team Effort

    At Dell Cloud Manager, we use lean teams to quickly research, design, develop, and test a new feature by fully dedicating a cross-functional team to a measurable goal. The blueprint versioning feature was a lean team effort that included representatives from UX, front-end engineering, back-end engineering, and product management. All of the participants were remote, and all were fully dedicated to minimize distractions. This allowed us to work very quickly and deliver the feature to our customers in record time. The UX team kicked off the collaboration by presenting an initial set of mockups that were reviewed and discussed with the entire team. We considered numerous options for how the feature could work and continuously revisited, and even refined, our primary goal. Once we came to a consensus, all team members worked in parallel, each of us with a common vision for the feature.

    User Studies

    Three business days after the initial kick-off meeting, the UX team ran a set of hour-long usability studies. The participants completed core tasks using an interactive prototype, developed in collaboration with front-end engineering. This prototype eventually became our final software implementation. Over the next week, we iterated on the UI design and ultimately “hooked it up” to the back-end engineering work.

    The usability studies validated our assumptions and revealed areas where we could improve our design and implementation. For example, we identified a subtle labeling issue: users were tripped up by a dialog button labeled “Edit version.” At that point in the workflow, users had already made their edits and wanted to “Save,” not “Edit.” We also found design and implementation gaps. Users were confused when creating a new version because there was no feedback that the version had been successfully created; the screen refreshed to the initial, default view, and users were left wondering whether their changes had been saved. These issues were quickly fixed.

    The most significant finding of the usability study related to our capability set. We experimented with the idea of allowing users to edit their blueprints directly in Dell Cloud Manager. However, we realized that the inline editor we provided was competing with the user’s external versioning system. If a user edited within Dell Cloud Manager, their version would not be recorded in their preferred version control system, so we decided to remove this feature. On the surface it might seem like we reduced the capabilities of Dell Cloud Manager, but in reality, it clarified the capability, reduced confusion, and led to a superior user experience overall.  By removing the inline editor, there is no longer confusion about where a user should edit files. And, there is no question about where version control is performed and managed. Using Dell Cloud Manager, our users can see their versions and switch between them. Any number of external tools can, and should, be used alongside Dell Cloud Manager to create and manage versions.

    Lean Team Impact

    The blueprint versioning feature was designed, developed, tested, documented, and deployed in 4.5 weeks. The tight collaboration of UX, engineering, and product management made it possible. The entire team was focused on building the essential components to support the best user experience. From reviewing the initial mockups to iterating on the UI design as a result of the usability study, the UX team was able to take user feedback and keep the lean team focused on the needs of the end user.


    The Dell Cloud Manager User Experience Team welcomes your feedback and suggestions! If you’d like to join our research panel and contribute your voice to the development of Dell Cloud Manager, please visit: http://www.enstratius.com/support/usability.

  • Dell TechCenter

    Dell Hybrid Cloud System for Microsoft: the future-ready alternative to DIY hybrid cloud

    Search the internet for the phrase "DIY hybrid cloud" (do it yourself) and the results will range from sponsored links for a multitude of reference architectures and managed service offerings to a few for building a hybrid cloud in your garage, all of which make the idea of DIY hybrid cloud sound easy.

    In 2013, Gartner predicted that by 2017 nearly half of all large enterprises would have hybrid cloud deployments. That does indeed seem to be the direction most enterprise customers want to go, but the journey to get there can be complex, difficult, and painful. In fact, many analysts put the current level of hybrid cloud adoption below 20%.

    The business demands that once drove the increased adoption of virtualization are now also driving the move to adopt private, public, and hybrid cloud solutions, but many IT operations are trying to meet that demand from a virtualization-up perspective, with legacy limitations on agility, choice, and governance.

    The increased demand for rapid adoption often pushes IT to deliver to an artificial deadline when they really need to take a step back and ensure that solutions for security, disaster recovery, operational efficiency, performance, scalability, capacity and financial planning are all in place.

    The critical needs of a successful on-demand hybrid cloud are very different from those of the virtualization model:

    - Cross platform orchestration and operational automation

    - Elastic, consumption-based model and measurement

    - Pay-as-you-go funding

    - Self-service provisioning

    - Seamless datacenter extensions

    - Flexible workload mobility

    - Federated identity management

    - Hybrid application management across the lifecycle

    For businesses to realize the promised efficiency, reduced costs, and competitive advantages of cloud technology, the adoption process needs to be easy, seamless, and non-disruptive for the IT department to plan, deploy, implement, and manage.

    IT operations must be viewed as, and truly become, a broker of services, rather than being focused on buying the components of a virtual datacenter infrastructure, providing desktop and server support, and administering workloads.

    IT must become focused on delivering value through automation of process, quality of service, and driving innovation, while also managing on-premises and off-premises cloud environments.

    Enter the Dell Hybrid Cloud System for Microsoft

    The Dell Hybrid Cloud System for Microsoft is about operational efficiency, not technical infrastructure. It is a fully integrated system that provides on-demand self-service, resource consumption and scalability on demand, 24/7/365 availability, archiving, backup, recovery, security, and automated failover.

    Dell and Microsoft minimize risk and break through the barriers of DIY hybrid cloud by providing a well-engineered, fully integrated, turn-key system, making hybrid cloud much easier to adopt, deploy, implement, and manage over its entire lifecycle.

    Together, Dell and Microsoft are providing the vehicle for customers to move, rapidly or at their own pace, from a virtual datacenter infrastructure or an existing private or public cloud to a much more agile, controllable, and future-proof hybrid cloud.

    The Dell Hybrid Cloud System for Microsoft helps customers extend their data center beyond current boundaries without forcing them to “cloud everything” as they strive to meet the demands of their end users and customers.

    Before you search the internet for “DIY hybrid cloud,” watch this Dell Hybrid Cloud System for Microsoft video.

    The video briefly tells the story of the difficulties of designing, building, configuring, implementing, and maintaining a “Do It Yourself Hybrid Cloud,” compared to the benefits of the Dell Hybrid Cloud System for Microsoft: the simple, fast way to get the best of private and public cloud in one easy-to-deploy, easy-to-manage hybrid cloud solution.

    The benefits include, but are not limited to:

    • Scale-Ready Payment Solutions
    • Azure and Azure services for Backup, Site Recovery, and Operational Insights to protect and control your cloud
    • Automated, non-disruptive, dependency-aware updates that save time and improve IT responsiveness
    • Scalability and elasticity of the public cloud with the control and security of private cloud
    • Deployment in less than 3 hours, from crate to cloud

    Then read this blog by Glenn Keels, Executive Director of Product Management and Marketing, Dell Engineered Solutions and Cloud, and visit www.dell.com/DHCS to start your own future-ready journey to hybrid cloud.

  • Information Management

    How to Integrate Big Data in the Classroom and Keep It Private

    When you were in school, did you ever look up at your teacher and think, “There’s someone who knows data analytics?”

    Sure, teachers somehow manage to size people and situations up pretty quickly. So maybe you looked at your teacher and thought, “How did she figure out who shot the rubber band into the ceiling tile?” or “How did he know Bill and Marco were chewing gum?”

    But you probably didn’t associate your teacher with big data and analytics.

    3 ways to integrate big data into the K-12 classroom

    In my last post, I mentioned that education generates mountains of data, but teachers rarely have the analytical tools to work with the variety and volume of that data. That’s changing with the advent of learning analytics, which applies predictive analytics to improve education for all students.

    A new report from THE Journal, called Game Changer: How Predictive Analytics is Transforming K-12 Education, highlights three tips from the New American Foundation for successfully integrating big data into the K-12 classroom:

    1. Provide professional learning opportunities during the normal workday. This is the only way to keep up with the students, who learn more about technology while gabbing with friends during recess than most teachers learn in a week.
    2. Select trainers who are both knowledgeable data analysts and effective, capable coaches. That may be a tough combination to find, but the ability to convey techniques is as important as the techniques themselves.
    3. Embrace the power of data from the top down. Teachers using analytics is good; teachers and administrators using analytics is better. Data analytics has the potential to fit smoothly with the overall culture of the school or district.

    5 ways to manage student information in the classroom

    If you’ve been in school recently or have children there, have you thought about the amount of data that a student generates? Assignments, grades, standardized test scores, attendance, health, financials and personal data are just the start. Then there’s the student’s online footprint: websites, passwords, posts, comments and uploaded/downloaded documents.

    Who’s protecting all of that information and keeping it private? Big data shapes the way in which students look at privacy. As they generate, capture and interact with data, they begin to recognize the importance of privacy and data security.

    The Game Changer report I mentioned includes five suggestions from the National Center for Education Statistics for managing information in student education records (a short illustrative sketch follows the list):

    1. Identify the elements of personally identifiable information (PII) that need to be protected; for example, a student’s Social Security number, date of birth, and mother’s maiden name.
    2. Confirm the need to maintain PII, then collect only relevant, necessary PII.
    3. Ensure that PII is accurate, timely and complete.
    4. Identify the risk level associated with different types of PII.
    5. Implement internal procedural controls to protect the PII.
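
    As a rough illustration of items 1, 4, and 5, the sketch below tags fields by risk tier and masks them before a record leaves a controlled system. The field names and tiers here are assumptions for illustration, not NCES definitions.

    ```python
    # Illustrative only: field names and risk tiers are assumptions,
    # not NCES-defined categories.
    HIGH_RISK = {"ssn", "date_of_birth", "mothers_maiden_name"}
    MEDIUM_RISK = {"home_address", "health_notes"}

    def mask_pii(record: dict) -> dict:
        """Return a copy of a student record with high-risk PII redacted
        and medium-risk PII partially masked."""
        masked = {}
        for field, value in record.items():
            if field in HIGH_RISK:
                masked[field] = "[REDACTED]"
            elif field in MEDIUM_RISK:
                masked[field] = str(value)[:2] + "***"
            else:
                masked[field] = value
        return masked

    student = {"name": "Jane Q.", "ssn": "123-45-6789", "grade": "A-"}
    print(mask_pii(student))  # "ssn" comes back as "[REDACTED]"
    ```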

    Breaches of financial, commercial, and government data grab the headlines, but there is plenty of big data in education, too. How do you think it’s affecting student privacy? Let me know in the comments below.

  • Application Performance Monitoring Blog

    Interacting with trace data

    Back in March, Cameron Haight posted a blog about APM vendors needing to bring some joy to the market. 

    As we have been developing the Foglight APM SaaS Edition, we have focused on simplifying the APM end-user experience and finding novel ways to help users discover the information hidden in their APM data. I have heard nice feedback about the Investigate portion of the product and how it lets users interact with their raw trace data in a way that goes well beyond the clinical.

    For those of you not yet familiar with the Investigate UI, it is designed to give advanced performance engineers access to raw trace data. We set out to tackle a challenging problem with this feature: how to help users visualize and understand the impact of multiple dimensions on application requests. As my co-worker Steve Fox pointed out (around the 26:30 mark in the video) in a great Velocity session, graphs and charts are limited to three dimensions of data. So when people want to look at more than three dimensions, they have to do things like page back and forth between different views while trying to hold the context of the previous view, or craft their own portal-style view from custom but unlinked two- and three-dimensional charts, inferring the relationships across the charts themselves.
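
    To illustrate that ceiling with generic matplotlib (not Foglight's actual Investigate UI), a single scatter plot can encode roughly four dimensions by mapping them to x, y, color, and marker size; a fifth dimension already forces a second, unlinked view. The trace fields below are synthetic stand-ins.

    ```python
    import matplotlib.pyplot as plt
    import numpy as np

    # Synthetic trace data: four dimensions per request (illustrative only).
    rng = np.random.default_rng(0)
    n = 200
    response_ms = rng.exponential(250, n)   # dimension 1: x-axis
    payload_kb = rng.uniform(1, 500, n)     # dimension 2: y-axis
    error_rate = rng.uniform(0, 0.2, n)     # dimension 3: color
    concurrency = rng.integers(1, 50, n)    # dimension 4: marker size

    # One chart tops out around four encoded dimensions; a fifth
    # (say, datacenter region) would already need a separate view.
    sc = plt.scatter(response_ms, payload_kb, c=error_rate,
                     s=concurrency * 4, cmap="viridis", alpha=0.7)
    plt.xlabel("Response time (ms)")
    plt.ylabel("Payload size (KB)")
    plt.colorbar(sc, label="Error rate")
    plt.title("Four dimensions in one view")
    plt.show()
    ```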