Dell Community

Blog | Group | Posts
Application Performance Monitoring Blog | Foglight APM | 105
Blueprint for HPC - Blog | Blueprint for High Performance Computing | 0
Custom Solutions Engineering Blog | Custom Solutions Engineering | 8
Data Security | Data Security | 8
Dell Big Data - Blog | Dell Big Data | 68
Dell Cloud Blog | Cloud | 42
Dell Cloud OpenStack Solutions - Blog | Dell Cloud OpenStack Solutions | 0
Dell Lifecycle Controller Integration for SCVMM - Blog | Dell Lifecycle Controller Integration for SCVMM | 0
Dell Premier - Blog | Dell Premier | 3
Dell TechCenter | TechCenter | 1,858
Desktop Authority | Desktop Authority | 25
Featured Content - Blog | Featured Content | 0
Foglight for Databases | Foglight for Databases | 35
Foglight for Virtualization and Storage Management | Virtualization Infrastructure Management | 256
General HPC | High Performance Computing | 227
High Performance Computing - Blog | High Performance Computing | 35
Hotfixes | vWorkspace | 66
HPC Community Blogs | High Performance Computing | 27
HPC GPU Computing | High Performance Computing | 18
HPC Power and Cooling | High Performance Computing | 4
HPC Storage and File Systems | High Performance Computing | 21
Information Management | Information Management | 229
KACE Blog | KACE | 143
Life Sciences | High Performance Computing | 9
OMIMSSC - Blogs | OMIMSSC | 0
On Demand Services | Dell On-Demand | 3
Open Networking: The Whale that swallowed SDN | TechCenter | 0
Product Releases | vWorkspace | 13
Security - Blog | Security | 3
SharePoint for All | SharePoint for All | 388
Statistica | Statistica | 24
Systems Developed by and for Developers | Dell Big Data | 1
TechCenter News | TechCenter Extras | 47
The NFV Cloud Community Blog | The NFV Cloud Community | 0
Thought Leadership | Service Provider Solutions | 0
vWorkspace - Blog | vWorkspace | 511
Windows 10 IoT Enterprise (WIE10) - Blog | Windows 10 IoT Enterprise (WIE10) | 4
Latest Blog Posts
  • General HPC

    The Democratization of Genomics Continues: How Health IT Professionals Can Enable Genomic-Driven Precision Medicine

    by Seth Feder

    Genomics is no longer solely the domain of university research labs and clinical trials. Commercial entities such as tertiary care hospitals, cancer centers, and large diagnostics labs are now sequencing genomes. Perhaps ahead of the science, consumers are seeing direct marketing messages about genomic tumor assessments on TV. Not surprisingly, venture capitalists are looking for their slice of the pie, investing approximately $248 million in personalized medicine startups last year.

    So how can health IT professionals get involved? As in the past, technology coupled with innovation (and the right use-case) can drive new initiatives to widespread adoption. In this case, genomic medicine has the right use-case and IT innovation is driving adoption.   

    While the actual DNA and RNA sequencing takes place inside very sophisticated instrumentation, sequencing is just one step in the process. The raw data has to be processed, analyzed, interpreted, reported, shared, and then stored for later use. Sound familiar? It should, because we have seen this before in fields such as digital imaging, which drove the widespread deployment of Picture Archiving and Communication Systems (PACS) in just about every hospital and imaging clinic around the world.

    As with PACS, those in clinical IT must implement, operationalize, and support the workflow. The processing and analysis of genomic data is essentially a big data problem, solved by immense amounts of computing power. In the past, these resources were housed inside large, exotic supercomputers available only to elite institutions. But today, HPC clusters built on scale-out x86 architectures with multi-core processors have made this power attainable to the masses, and thus democratized it. The parallel file systems that support HPC are much easier to implement and support than they once were, as are standard high-bandwidth InfiniBand and Ethernet networks. Further, the public cloud is emerging as a supplement to on-premises computing power. Some organizations are exploring off-loading part of the work beyond their own firewall, either for added compute resources or as a location for long-term data storage.

    For example, in 2012 colleagues at Dell and I worked with the Translational Genomics Research Institute (TGen) to tune its system for genomics input/output demands by scaling its existing HPC cluster to include more servers, storage and networking bandwidth. This allowed researchers to get the IT resources they needed faster, without having to depend on shared systems. TGen worked with the Neuroblastoma and Medulloblastoma Translational Research Consortium (NMTRC) to develop a methodology for fast sequencing of childhood cancer tumors, allowing NMTRC doctors to quickly identify appropriate treatments for young patients.

    You can now get pre-configured HPC systems tuned for genomic software toolsets, which enables clinical and translational research centers like TGen to take on large-scale sequencing projects. The ROI and price-performance are compelling for anyone running heavy genomic workloads. Essentially, with one rack of gear, any clinical lab now has all the compute power needed to process and analyze multiple genome sequences per day, which is a clinically relevant pace.

    Genomic medicine is here, and within a few years it will be standard care to sequence many diseases in order to determine the proper treatment. As the science advances, the HPC community will be ready to contribute to making this a reality. You can learn more here.

     

  • Dell TechCenter

    Dell Software Recognized as a 2015 Confirmit ACE Award Winner


    Confirmit names annual award winners that best demonstrate how to use the Voice of the Customer to drive business results

    Dell Software, May 13, 2015 – Dell Software has been awarded a 2015 Confirmit ACE (Achievement in Customer Excellence) Award for the sixth consecutive year. The accolade demonstrates Dell’s long-term commitment to providing its global customer base with unparalleled customer service.

    The Confirmit ACE Awards program celebrates outstanding achievement in Voice of the Customer and customer experience. Receiving a Confirmit ACE Award is a distinct honor that demonstrates the recipient’s rigorous application of customer experience processes, and its outstanding performance as measured by those processes. An elite group of Confirmit clients qualified for an ACE Award.

    Dell earned the Confirmit ACE Award based on its ability to monitor customer feedback and implement support improvements that deliver the best possible customer experience.

    “We are delighted to be a Confirmit ACE award winner,” said Sean McEvoy, Worldwide Leader, Support, Dell Software. “We are committed to providing the best customer experience possible, and this award demonstrates that we are delivering on that promise. As we work to continually improve our world-class support program, input from our customers and partners is critical, and we consider Confirmit an important partner in our program’s continued success.”

    "Dell has proven to be a true leader in customer excellence. Its comprehensive program ensures that the Voice of the Customer is not just part of the business process, but built into the fabric of the company to improve business results and drive change,” said Henning Hansen, President and CEO of Confirmit.  “We are proud that Dell collaborates with Confirmit for its customer experience initiatives.”

     

  • Information Management

    Extending the Value of SCOM: Do You Provide the Best Possible Healthcare for Your SQL Server Environment?

    I like to think of SCOM as a great hospital, designed to bring together the best facilities and professionals to deliver the highest level of healthcare to its patients. Just like a hospital, SCOM provides the framework, infrastructure and delivery of the basic healthcare services you would expect. And, just like a hospital, SCOM can't possibly cover all of the specialties needed to span the full spectrum of issues that could crop up.

    SCOM can be customized for a specific domain, but in the heat of an emergency you need to rely on best-in-class dedicated resources that deliver the fastest possible diagnostics and the most reliable root cause analysis. Should trouble with a database occur (and the odds are about 100% that it will over time), DBAs just aren't getting what they need from SCOM alone to effectively troubleshoot and diagnose performance issues. Think of it this way: your primary care physician can spot a problem, but you'll likely need a specialist to resolve it and prevent it in the future.

    To get the right specialized care for your SQL Server health, it’s important to first understand the areas where SCOM doesn’t fully address your database needs:

    • Day-to-day monitoring ― Problem-solving requires more than a heads-up message. What's needed is real-time information about performance against baselines, current trends, and current top resource utilization and workload, which enables deep workload analysis.
    • Database health root-cause analysis ― When incidents occur, time is of the essence. Turning to custom code or another tool to troubleshoot the problem isn't efficient. What's necessary is the ability to quickly review a deadlock event and compare the performance of the server for symptoms of an issue, which offers the information you need for correct diagnosis and repair.
    • Query performance ― With SCOM alone, you have no visibility into query performance. Having that sightline allows for multi-dimensional analysis of performance.
    • Troubleshooting real-time and historic issues ― As mentioned, problems will happen, and being reactive only lets you deal with a problem after the fact. Having a handle on database performance allows you to proactively address important issues, such as the day-to-day utilization of servers, which applications use the most disk space and which use the most memory.
    • Change tracking ― Many times, the development team writes and deploys code with only basic unit and integration testing. Users need to be able to identify changes to server or database configuration, SQL plans and database objects, and align them with performance.
    • Comparison of performance ― SCOM-only environments don't allow you to compare application performance over time or against performance measures. Such analysis can uncover hidden performance demons.

    With SCOM alone, DBAs would potentially need to write complex scripts to create a picture of server health, leaving them short of all the information necessary to troubleshoot issues. Having complementary solutions that assist in diagnosing and solving problems drastically enhances SCOM's capabilities. The end result is a great partnership that optimizes the health of your SQL Server environment and delivers the performance your end users have come to expect.
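    To make that concrete, below is a minimal sketch of the kind of hand-rolled health check a DBA ends up writing without a dedicated tool. It is illustrative only: it assumes Python with the pyodbc package, an ODBC driver for SQL Server, and a hypothetical server name, and it pulls the top CPU-consuming statements from the sys.dm_exec_query_stats DMV.

    ```python
    # A hand-rolled "query performance" check of the kind SCOM alone
    # forces on DBAs: top statements by average CPU, from the plan cache.
    # Assumes: pip install pyodbc, plus an ODBC driver for SQL Server.
    import pyodbc

    CONN_STR = (
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=your-sql-server;Trusted_Connection=yes;"  # hypothetical server
    )

    TOP_CPU_SQL = """
    SELECT TOP 5
        qs.total_worker_time / qs.execution_count AS avg_cpu_microseconds,
        qs.execution_count,
        SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
            ((CASE qs.statement_end_offset
                  WHEN -1 THEN DATALENGTH(st.text)
                  ELSE qs.statement_end_offset END
              - qs.statement_start_offset) / 2) + 1) AS statement_text
    FROM sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
    ORDER BY avg_cpu_microseconds DESC;
    """

    with pyodbc.connect(CONN_STR) as conn:
        for row in conn.execute(TOP_CPU_SQL):
            print(f"{row.avg_cpu_microseconds:>12,} us avg | "
                  f"{row.execution_count:>8,} execs | {row.statement_text[:80]}")
    ```

    A dedicated monitoring solution wraps this kind of plumbing in baselines, history and alerting, which is exactly the gap described above.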

    Want more information on how to maximize your SCOM investment? Read this white paper, “Go Beyond Basic Up/Down Monitoring.” You’ll get expert guidance to extend SCOM’s capabilities with predictive performance diagnostics and best practices for monitoring and troubleshooting your SQL Server environment.

     

  • Statistica

    Thought Leaders in the Mix - June 2015

    Our subject matter experts for Statistica and the Information Management Group (IMG) keep busy, staying abreast of current trends with big and small data, predictive software and real-world analytics solutions. And they frequently comment in industry publications, professional forums and blogs. Here are a few of their recent articles.
     
    Analytics in Healthcare: Q&A with Dr. Charlotte Hovet, Part 1
    by John Whittaker, executive director of product marketing

    In this interview, Dr. Charlotte Hovet, medical director of Dell’s Global Healthcare Solutions, shares her thoughts about how healthcare informatics and predictive analytics are helping to usher in a new era of wellness and disease prevention.

    Connected Cows and the Evolution of Agriculture IoT
    by Shree Dandekar, executive director of product management (analytics)

    The benefits of agriculture IoT sound enticing, but there are architectural challenges to be addressed before deciding on a solution that turns 6,000 head of cattle into a data powerhouse. Dandekar describes Dell's successful deployment at Chitale Dairy.

    Why companies can't afford to go overboard with analytics
    by Joanna Schloss, business intelligence and analytics evangelist

    While advanced analytics is a critical component of organizational success, Schloss outlines in her latest CMSWire article the drawbacks of excessive analysis and the benefits of focusing on innovation rather than optimization.

  • vWorkspace - Blog

    Feature spotlight: vWorkspace 8.6 - infrastructure internationalization (i18n) and connector localization (l10n)

    In Wyse vWorkspace 8.6 (available now) there are substantial changes to the infrastructure and connectors to support environments that use non-Roman alphabets, Unicode and double-byte character sets (DBCS).

    Infrastructure internationalization, or i18n - Internationalization for vWorkspace means adding support for the Unicode characters found in languages such as Chinese, Japanese and Korean. The purpose of the internationalization effort is to allow customers to install and run vWorkspace in non-Roman-alphabet environments such as (but not limited to):

    • Microsoft Active Directory - for displaying Unicode computer and user accounts, OUs, domain names...
    • Microsoft Windows Server - support for installing any vWorkspace roles that run on Windows Server
    • Microsoft SQL Server - for the vWorkspace configuration and monitoring and diagnostics databases

    This also means that the following also support Unicode characters:

    • vWorkspace resource names; computer groups, session host folders, managed applications...
    • vWorkspace installers
    • vWorkspace control panel applets

    Infrastructure internationalization does not mean that the menus of the vWorkspace and WSM management consoles are displayed in non-Roman alphabets, but the consoles can display Unicode content, such as user names and computer names... A localized management console is planned for a release after 8.6.
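    As a side note, the difference between legacy DBCS code pages and true Unicode support is easy to demonstrate. The following minimal Python sketch (illustrative only, not vWorkspace code; the names are hypothetical) shows that a single-byte Western code page cannot represent CJK names at all, and even a DBCS code page covers only its own script, while UTF-8 round-trips everything:

    ```python
    # -*- coding: utf-8 -*-
    # Why Unicode support matters: legacy code pages vs. UTF-8.

    computer_name = "営業部-PC01"   # a Japanese computer name ("sales dept")
    korean_user = "김철수"          # a Korean user name

    # UTF-8 (Unicode) round-trips both names losslessly.
    assert computer_name.encode("utf-8").decode("utf-8") == computer_name
    assert korean_user.encode("utf-8").decode("utf-8") == korean_user

    # A single-byte Western code page cannot represent CJK characters at all.
    try:
        computer_name.encode("cp1252")
    except UnicodeEncodeError as err:
        print("cp1252 cannot store this name:", err)

    # A DBCS code page such as cp932 (Shift-JIS) handles Japanese,
    # but fails on Korean; each legacy code page covers one script family.
    computer_name.encode("cp932")   # works
    try:
        korean_user.encode("cp932")
    except UnicodeEncodeError as err:
        print("cp932 cannot store this name:", err)
    ```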

    Connector localization, or l10n - Localization of the vWorkspace connectors means that the installers and user interface menus are available in, or support, multiple languages, such as English, Simplified Chinese, Japanese and Korean. Additional languages are planned for release later in the year.

    • vWorkspace connectors are the end-user software used to connect to remote applications and virtual desktops
    • The following vWorkspace connectors will be localized: Windows, Mac, Linux, iOS, Android and Chrome (new for 8.6). Because the Web Access user interface is localized, the HTML5 and Java connectors are also localized.
    • Localized vWorkspace connectors may be installed on operating systems that use non-Roman alphabets, such as Chinese (Simplified), Japanese and Korean (additional languages to follow).

    In subsequent feature spotlights I will write about the new vWorkspace connector for Google Chrome, as well as the simplified configuration and user interface for all vWorkspace connectors.

  • Information Management

    You Now Have More Access To Advanced Analytics, But At What Cost?

    Big data has made big strides in recent years. Specifically, more organizations than ever are leveraging data and information to deliver actionable, valuable business insights. While the big data problems of the past centered on making sure infrastructure could keep up with how much data was being pulled, significant advancements in storage and other infrastructure technologies now give companies a firm foundation on which to deploy their predictive models.

    Thus far, 2015 has provided new opportunities to bring analytics directly to business users, and with them, challenges that go beyond the datacenter alone. These opportunities and challenges have already begun to present themselves, and organizations are learning to address them in the following ways:

    Opportunity: Enterprises are using existing technology with big data platforms to deliver ROI

    While the analytics space has historically been crowded with BI, dashboarding and other tools, more enterprises have begun to pair new platforms with existing analytics programs to unlock business value. To begin with, enterprises are looking for ways to combine data visualization with data analytics solutions so they can more easily interpret vast amounts of unstructured data. While the interpretation challenges remain, those who apply visualization solutions can map out meaningful insights that everyone from non-technical executives to data scientists can read effortlessly.

    One of the ROI-achieving byproducts of visualization and analytics is that insights now become more accessible to a wider user base. With BI vendor offerings becoming increasingly easy to operate, business-minded users who might not have the background to use traditional systems are finally able to leverage data analytics to create new revenue streams. In doing so, they’re able to deliver better customer experiences and expand into new markets.

    Challenge: Self-service and automated decision-making are influencing businesses to reorganize 

    While the demand for candidates skilled at interpreting data still surpasses the supply, companies are coping with this shortage by investing in self-service, automation and augmentation platforms. Ultimately, organizations are leveraging automated decision-making and data discovery tools for improved cost and efficiency. At the same time, they must be prepared to restructure significantly to achieve competitive advantages like using data to proactively cross-sell and up-sell. Many operational processes can now be executed automatically, end to end, with data analytics. When adopting programs that automatically push successful predictive models straight to the data, organizations should spend time checking the source to ensure the usefulness and relevance of tried-and-true models. While automation can enable real-time analytics, resources should still be allocated toward making sure current models are the best ones to use.

    Opportunity: The growth of Information as a Service (IaaS) is providing easier access to analytics

    There is a steady movement from simple, backward-looking descriptive analytics to advanced analytics that predict outcomes and prescribe a course of action. This creates more opportunities to democratize access to analytics. One option emerging from both this movement and the rising popularity of “as-a-service” delivery models is “information as a service,” or IaaS. The availability of IaaS further lowers the barrier to entry for businesses that historically have not had the technology, finances or skills to leverage advanced analytics, and provides them with an additional competitive edge to bolster growth.
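    To ground the terminology, here is a small illustrative Python sketch (with made-up numbers) of the step from descriptive to predictive analytics: summarizing what sales already did versus fitting a model that forecasts the next period.

    ```python
    # Descriptive vs. predictive analytics on a toy sales series (made-up data).
    # Assumes: pip install numpy scikit-learn
    import numpy as np
    from sklearn.linear_model import LinearRegression

    monthly_sales = np.array([112.0, 118.5, 121.0, 127.2, 131.8, 138.4])  # $k

    # Descriptive: a backward-looking summary of what already happened.
    print(f"mean={monthly_sales.mean():.1f}k  min={monthly_sales.min():.1f}k  "
          f"max={monthly_sales.max():.1f}k")

    # Predictive: fit a trend and forecast the next month.
    months = np.arange(1, 7).reshape(-1, 1)          # feature: month index
    model = LinearRegression().fit(months, monthly_sales)
    forecast = model.predict(np.array([[7]]))[0]
    print(f"forecast for month 7: {forecast:.1f}k")

    # Prescriptive analytics would go one step further: choose an action
    # (e.g., an inventory or staffing level) that optimizes an objective
    # given the forecast.
    ```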

    Challenge: You’ll find network and security challenges at the intersection of big data and IoT

    The growth of connected endpoints is making more information available for extracting insight. This, in and of itself, has driven both the growth of IoT and the need for analytics. With more data, however, comes more exposure to vulnerabilities such as cybersecurity threats, compliance issues and other risks. Although this creates a market opportunity for vendors offering integrated solutions that cover a comprehensive list of data analytics, endpoint management, threat detection and compliance needs, the reality for IoT organizations is that they already struggle not only to mine new data pools but also to store the data securely.

    Organizations continue to have an opportunity to benefit from advanced analytics, as access to data only gets easier and making sense of it simultaneously grows less complex. This creates opportunity for people outside the data scientist profile, from business users who need analytics to solve a marketing problem to small businesses that, until yesterday, couldn't afford the time or costs associated with pulling insights from their data. While other challenges have emerged and will continue to do so, improved accessibility has opened a huge window of opportunity that helps businesses use their data to sharpen their competitive advantage.

  • vWorkspace - Blog

    Dell Wyse vWorkspace v 8.6 Beta

    Hello,

    The Dell Cloud Client-Computing group has released a public beta of Wyse vWorkspace version 8.6. For many years vWorkspace has been a fully featured virtual desktop delivery platform that has matured with each release. Wyse vWorkspace supports VMware, Microsoft and Odin (formerly Parallels) as the compute engines for hosting virtual desktops, and it also brokers connections to RDSH sessions. This provides unparalleled flexibility as well as a simplicity you wouldn't expect from such a feature-rich solution. Over the years we have added many features to vWorkspace to streamline VDI deployments. For example, in 2012 we added Hyper-V Catalyst, a caching mechanism that improves the I/O performance of virtual desktop storage, both speeding the provisioning of virtual desktops and decreasing their storage footprint. The following year we added Hyper-V Catalyst support for Terminal Servers, so you could build RDSH hosts from a golden image leveraging Microsoft differencing disks. Last December we also integrated Wyse WSM into vWorkspace Premier licensing (a feature now called vWorkspace WSM), allowing desktop images, layered applications and server images to be delivered separately, on demand from the cloud, to highly secure stateless and diskless clients.

    With the new 8.6 release, we are adding the features below. Some of them are housekeeping items, while others advance the platform, such as high availability for Hyper-V hosts.

    We would like to invite you to join us in sampling the latest features and improvements.

    • Internationalization
      • Added Unicode support for vWorkspace core products such as the Connection Broker and Web Access as well as the connectors.
    • Localization
      • Localized connectors for Chinese, Japanese, Korean.
    • WSM
      • Layering support via PowerShell
        • Integrate virtual layers into a vWorkspace virtual desktop to separate departmental applications from the virtual desktop's operating system.
      • User Profile Layering
        • Captures user profile data in a layer that can be accessed from any device.
      • Hot High Availability
        • Allows clients streamed by WSM to seamlessly switch to a different WSM server in the event of a failure.
      • Centralized Site Management
        • Provides single management UI for all WSM Sites.
    • Advanced Hyper-V integration
      • Full clone provisioning with Hyper-V
        • This direct integration with Windows Server Hyper-V adds support for provisioning of full, persistent virtual machines where previous versions of vWorkspace HyperDeploy only supported differencing disk clones.
      • High availability of Hyper-V hosts
        • When using a Hyper-V cluster as the compute engine for your vWorkspace virtual desktops you have the option to set the virtual machines to be available across the cluster (High Availability).
    • vWorkspace Connectors
      • Connector for Google ChromeOS
        • This connector for ChromeOS provides a more native user experience than HTML5 and will be available via the Google Chrome Web Store.
      • Simplification
        • A complete streamlining of the vWorkspace connectors' user interface to improve end-user ease of use.
      • Email-based configuration
        • The only thing a user needs to know to obtain a configuration is their email address and password.
      • 64-bit
        • Support for 64-bit Remote Desktop extensions (virtual channels)
    • vWorkspace Connector for Windows – drag and drop
      • Allows authorized users to drag files and content from their Windows PC and drop them into vWorkspace desktops and applications. This simplifies common workflows; for example, insurance agents can now drag files directly into the correct claim record of their hosted CRM application. This functionality is not available in the native Windows Remote Desktop Connection.
    • Monitoring and diagnostics
      • Monitoring of Microsoft Remote Desktop Licensing Service infrastructure
      • Monitoring of vWorkspace WSM infrastructure

    If you have previously signed up for the Beta, you will receive an email containing a registration link. If you have not, you can fill out the registration form at https://dell.getfeedback.com/r/tUNGB0oX

    Once you have registered, you will be provided a link to the vWorkspace beta community. The link will direct you to a private community where you will need to request access to the group. If you do not have an account, you will be given the option to create one. Once your account is created and access to the group has been granted, you will have access to the Beta installation download and documentation.

    If you have any questions regarding the program, you can email me at kelly_craig@dell.com or write to vWorkspace_beta@software.dell.com.

    Documentation

    Documentation can be found in the Download section of the beta community group.

    Support

    Support for the Beta is provided through the group forum.

     

    With regards,

    Kelly Craig 

    vWorkspace Product Management

  • KACE Blog

    You Need a Systems Management Solution that Works Hard So You Don't Have To

    Educational institutions are always looking for ways to enhance student learning, and to that end, many are adopting or expanding their digital curricula. That requires investing in computing infrastructure — but it doesn’t have to mean overwhelming your IT staff or breaking the budget. The key is to abandon the old manual, time-consuming and error-prone methods for systems management tasks and find a systems management solution that will streamline and automate tasks like imaging machines, tracking hardware and software inventory, installing software and updates, managing troubleshooting workflows, enforcing security policies, and reducing energy usage.

    Dell KACE solutions enable you to efficiently manage the complete systems lifecycle, from deployment to retirement, empowering your IT team to gain efficiencies, cut costs, enhance security and focus on more strategic projects. In short, they work hard so you don’t have to.

    The Dell KACE portfolio is comprehensive, providing initial operating system deployment, application distribution, patch management, asset management, endpoint security and service desk functionality. Dell KACE solutions support all popular operating systems and can manage all of your computing devices enterprise-wide. Plus, you have the flexibility to implement Dell KACE solutions in the way that best fits your environment and resources: as on-premises physical appliances that plug into your environment, or as on-premises virtual appliances that run on your own servers. One solution (the Dell KACE K1000) is even available in the cloud as a hosted virtual appliance.

    Even better, Dell KACE solutions are easy to deploy and use, and they offer a low total cost of ownership (TCO). With a simple plug-and-play architecture that virtually eliminates installation and maintenance, along with scalability to meet the needs of your growing organization, Dell KACE solutions are designed to be both immediately productive and trouble-free for the long term, saving you time and money.

    To learn more about how the Dell KACE portfolio can streamline and automate systems management tasks to support digital learning at your school or college, be sure to read our whitepaper, “Solving Systems Management Challenges for Education.”


    About Lolita Chandra

    Lolita is a Product Marketing Manager for Dell KACE. She has over 10 years of product marketing experience with IT software and infrastructure-as-a-service solutions.


  • Dell Big Data - Blog

    Unlocking Competitive Advantages in Manufacturing Using Big Data

    by Armando Acosta


    Increasingly, the manufacturing industry sees big data as an important tool for unlocking insights that drive competitive advantage. This is leading to greater adoption of big data strategies and solutions by a wide range of companies in every vertical. In a series of overview stories, insideBigData examined how the manufacturing industry is using big data and the success that is being realized.

    For example, some of the multitude of benefits manufacturers are seeing from big data include:

    • Measuring compliance and traceability down to the machine level.
    • Correlating daily production activity with a company's financial performance.
    • Realizing higher degrees of visibility into supplier quality and performance over time.

    The article also points out some significant challenges. According to insideBigData, "Many manufacturers don’t have access to the tools that they need for deploying big data solutions - to get results faster, to perform required calculations - regardless of what they’re manufacturing."

    Some of the challenges shared by many organizations as they begin a big data initiative include:

    • Handling data quality and performance - A big data infrastructure must align with business goals, to deliver actionable, real-time insights.
    • Expanding the data handling strategy - The ability to analyze a variety of data, from traditional structured sources to semi-structured and unstructured sources, is paramount to successful data handling.
    • Identifying scalable big data solutions - Companies need flexible, scalable big data infrastructure that can integrate front-end and back-end systems, and grow with a company's needs.

    You can learn more about how manufacturers can benefit from big data, how the industry is adopting the technology, and read a case study in this white paper.