This blog post is authored by Vijay Kumar TS, Project Manager Dell Hypervisor Engineering.
Did you know? Dell's VMware support documents are posted at VMware general availability (GA), and they give you detailed information on getting started with VMware on Dell PowerEdge servers. These documents are updated periodically to reflect the latest changes. The following documents are posted at support.dell.com:
1. VMware vSphere On Dell PowerEdge Systems Deployment Guide: This document helps you deploy VMware ESXi on Dell PowerEdge servers and provides specific information about recommended configurations, best practices, and additional resources.
2. Dell PowerEdge Systems Running VMware vSphere Getting Started Guide: This document helps you set up your Dell PowerEdge system running VMware vSphere for the first time.
3. VMware vMotion and 64-bit Virtual Machine Support For Dell PowerEdge Systems: This document helps you determine the compatibility and support of Dell PowerEdge systems with respect to VMware vMotion and 64-bit virtual machines.
4. VMware vSphere on Dell PowerEdge and Storage Systems: This document provides information about Dell-supported systems and Dell storage arrays compatible with VMware ESXi.
5. Dell VMware ESXi Image Customization Information: This document describes how the Dell customized image differs from the base VMware image.
6. Upgrade Guide to ESXi Using Dell Customized Image: This document provides information for upgrading an existing ESXi installation using the Dell customized image.
7. VMware Virtual SAN Product Information Guide: The objective of this document is to assist in the deployment of VMware® Virtual SAN on Dell PowerEdge servers.
8. VMware vSphere On Dell PowerEdge Systems - Release Notes: This document provides information on all the known issues with respect to VMware ESXi and Dell PowerEdge servers.
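For readers of the upgrade guide (item 6), the procedure typically comes down to a couple of esxcli commands run on the host. The sketch below is illustrative only: the offline-bundle path and profile name are placeholders, not actual values, so consult the Dell customized image's documentation for the correct bundle and profile for your server generation.

```shell
# List the image profiles contained in the Dell customized offline bundle.
# The bundle path is a placeholder -- substitute the file you downloaded.
esxcli software sources profile list \
    -d /vmfs/volumes/datastore1/Dell-ESXi-offline-bundle.zip

# Upgrade the running ESXi installation to the chosen profile.
# "Dell-ESXi-Profile" is a hypothetical profile name from the listing above.
esxcli software profile update \
    -d /vmfs/volumes/datastore1/Dell-ESXi-offline-bundle.zip \
    -p Dell-ESXi-Profile

# Reboot the host to complete the upgrade.
reboot
```

Note the use of `profile update` rather than `profile install`: update preserves VIBs already on the host (such as drivers) that are newer than those in the profile, which is usually what you want for an in-place upgrade.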
What's new in VMware vSphere 5.1 Update 3
Please refer to the release notes, which describe the updates in this release.
This is a MANDATORY hotfix and can be installed on the following vWorkspace roles:
NOTE: It is important that the Instant Provisioning tool be installed after Virtual Desktop Extensions (PNTools) has been installed on the template.
This release provides support for the following:
Instant Provisioning takes up to 4 minutes longer than in previous versions.
Instant Provisioning does not execute the custom scripts in the VBScripts folder.
Reprovisioned VMs running on VMware vSphere 5.5 do not retain their MAC address.
Managed Applications assigned to Active Directory groups or OUs will not appear when using Deferred Authentication or Managed Domains.
This hotfix is available at https://support.software.dell.com/vworkspace/kb/146740
As the New Year descends upon us, I’m excited to see that the promise of cloud technology has come to fruition. What am I basing that on? Well, recently, a doctor friend of mine commented about how she could easily solve her problem by storing data in the cloud. Don’t you just love it when a complex technology environment can be summarized in a single, eight-word comment? For me, it means that the use cases and the technology have evolved to a point where mass adoption is occurring. Technology vendors call that safe opportunity. As clouds become ever more pervasive, what does that mean for the analytics that are driven from these “hosted environments”? I see several trends emerging with the maturation of cloud platforms and analytics.
For example, SalesForce.Com (SFDC) is only too happy to help with this new business opportunity of delivering analytics. The idea here is that SFDC creates your analytics for you, and you simply consume them in the cloud. The benefits are obvious – business users have a consistent view of the analytics, the analytics are readily available and stay up to date, and the cost can be very reasonable. As always, though, all good things have limitations; in this case, SFDC creates the analytics, meaning the business is restricted to and dependent on the algorithms and analysis that SFDC found relevant and interesting. A second shortcoming of this model is that with greater adoption there may be more interest, and as with anything else, more volume means more cost incurred by the customer. Third, you don’t get to interact with the analysis as your own – you are only borrowing the analytics, much as you’d borrow a book from the library. Finally, all of SFDC’s clients have access to the same analytics you are being served, making differentiation harder to come by. Many companies might prefer analytics that are unique to their organization.
The next option is to have someone host your analytics for you. This is subtly different from the subscription model: here you rent the space, the data, and the application to create more customized analytics. Hosted analytics are very popular for many of the same reasons as subscription analytics. This model has the added benefit, though, of enabling you to massage or mash up the data in ways that are unique to your business. In addition, your own business analysts have access to this dedicated environment. The downside, of course, is that the analytics live outside your firewall. The cost can also be large if you have many users, and providers often don’t allow you to take the data inside your own firewall; if you want to combine it with your own corporate data, that corporate data has to be uploaded to the hosted site, opening up a sea of potential governance issues.
The traditional path to analytics is one where companies create their own analytic warehouse with an analytic server and tools to support this environment. This approach has many advantages, especially since the tools available on premise are powerful and inventive. The downside tends to be cost (both capital and maintenance) and agility. Traditional on-premise analytic environments require hardware, software, and services. The other downside revolves around supporting these systems: resources are required, support must be given in a timely fashion, and all this work falls on corporate IT, which is oftentimes already overburdened with requests. Business analysts have many requests to access, change, or deliver on data, reporting, or analytic requirements, and these tend to queue up and create latency and delivery problems in IT. Last but not least, the data required to support complete and meaningful analytics is often housed in a myriad of places – i.e., many different data types and platforms. Data found in a hosted application often cannot be brought down from the vendor to the client, and even when the data can be moved, synchronization and data lineage become big issues that drive the inaccuracy often found in analytics.
Hybrid Analytics – Future Vision
Some in the cloud, some on premise. This seems to be the solution for the future. As customers uncover and create these hybrid solutions, we will see how well the existing tools adapt to the changes. I am excited to see what our customers will do with the combination of hosted and on-premise analytics, as it seems like a promising path to achieving operational excellence in the near future.
Read the new research report, “Analytics in the Cloud: A study conducted by Enterprise Management Associates” for more details on how your peers are using cloud-based analytics.
This blog will be where members of the Statistica team--a dedicated group of trouble-shooters, project managers, subject matter experts, sales engineers and thought leaders--can freely connect with each other and with members of the broader Statistica user community.
We are excited to share our thoughts about issues and trends and products within the dynamic realm of statistics, business analytics, big data, predictive solutions, and information management. This community is also intended to facilitate your use and comprehension of the award-winning Statistica analytics platform, through the eventual sharing of blogs, webcasts, media, whitepapers, tools, solutions, and more. We appreciate your patience as we roll out relevant content here, and we recommend you visit the excellent resources already available via the Statistica product page. Meanwhile, come back and come back often to learn what’s new in the world of Statistica.
Or better yet, join the community if you’re not already a member and visit our main blog page to email subscribe to this blog. (The link is under "Options.") Feel free to give us feedback and suggestions by posting your comments on a blog post or a how-to wiki article. We want to provide tools and solutions that are practical and helpful for you.
-- The Dell Statistica Team
IT environments are becoming increasingly diverse and complex, and consequently harder to manage. Mobility, along with more and more “smart” devices (i.e., the Internet of Things), has led to a significant increase in the number and types of devices connected to corporate networks – devices that you must now track, manage, and secure. The scope of the problem becomes even larger when you also consider the number of different platforms that need to be managed, including Windows, Windows Server, Mac OS X, Linux, UNIX, Chromebook, iOS, and Android. Issues and concerns may include:
All of these factors are driving the imperative for you to make anypoint systems management a higher priority. If your organization previously felt that it could ‘get along’ without a comprehensive systems management strategy, you are likely now feeling the pressure to find a comprehensive “anypoint” systems management solution, one that is both easy to use and addresses all of your concerns.
The Dell KACE K1000 Systems Management Appliance (K1000) provides you with that solution. The K1000 provides comprehensive systems management for computers and servers, including discovery, inventory and asset management, software distribution, patch management, policy enforcement and service desk. And the latest release of the K1000 (available in March 2015) strengthens the focus on anypoint systems management and delivers greater visibility across your entire IT environment. New and enhanced functionality expands the types of devices managed by the K1000 to Chromebooks, Windows servers and network connected non-computer devices. We have also made improvements to the usability of a number of K1000 components. With this release, the K1000:
You can find out more about the latest enhancements to the K1000 by attending a live demo or by visiting www.kace.com or http://software.dell.com/products/kace-k1000-systems-management-appliance/.
Adopting new technologies that best fit your company’s business challenges can be a difficult process. The difficulty of separating marketing speak from true business value causes many firms to take a wait-and-see attitude toward cutting-edge solutions, creating innovation gaps and hindering competitive advantage.
Cloud solutions are an excellent example of a technology that was initially slow to gain traction but today can’t be ignored and should be part of your overall business strategy. It’s difficult to find a company that hasn’t adopted cloud at some level to gain advantage in its markets. While adoption of cloud is pervasive, some companies are jumping ahead by combining the technology with a smart strategy. These innovative decision makers have fostered “cloud first cultures” that empower their teams to investigate cloud-based solutions before traditional on-premises solutions.
By starting with cloud, these companies are building more agile and responsive IT infrastructures and can more readily take advantage of cloud benefits.
Companies that support cloud first find themselves better positioned to take advantage of newer technologies because they have invested in a more agile and flexible foundation. As their commitment to cloud grows, so does their proficiency at managing and leveraging these environments. A research study from Enterprise Management Associates (EMA) covering over 600 Big Data projects shows that cloud is playing a critical role in these projects: over 30% of the Big Data projects are deploying additional infrastructure as a service (IaaS), platform as a service (PaaS), or software as a service (SaaS) solutions to execute these new, innovative workloads. Companies already successful with cloud-based projects are better positioned for future success, and cloud is speeding their progress.
As the landscape of IT and data management grows, these companies are relying on innovative vendors to support more complex projects while supplying a wide choice of solutions that are easy to manage from a systems, security, performance, and configuration standpoint.
Companies adopting cloud along with a cloud-first culture are finding it easier to innovate, reduce risk, and move faster than their competition.
Have a look through EMA’s research report, “Analytics in the Cloud: A study conducted by Enterprise Management Associates” for more details on how your peers are using cloud-based analytics.
A year ago, I published a blog post about running Ubuntu on the Precision M3800. As someone involved in Project Sputnik and a lifelong Linux user, I knew this was a system Linux users would love, so I thought I'd make their job of getting the M3800 up and running a little easier. Since then, that blog post has seen 40,000 views, with a follow-up post receiving 10,000 views. Beyond the view counts, we've received an overwhelming response from the community desiring an officially supported product. With the momentum we've built, I'm proud to say that the Precision M3800 is joining the XPS 13 Developer Edition as its official big brother. Oh, and I should note that the XPS 13 DE, now in its fourth generation, is more svelte than ever. The fourth-generation XPS 13 DE will be available for sale soon.
Because the M3800 is in the Precision line, we can offer this system out of the gate with Ubuntu as a build-to-order offering worldwide. That means you will have a full range of configuration options. I'd like to thank the Precision product marketing team for enthusiastically helping with this launch. Note that their Precision M4800 and M6800 systems have already been orderable with Ubuntu or RHEL for some time. (Want a mobile workstation with full RAID 5 and 32GB of RAM? Dell's got you covered.)
The refreshed M3800 now has an optional UHD "4K" display in addition to the base FHD option; that's up from the previous QHD+ display. Additionally, the latest Precision M3800 adds hardware support for Thunderbolt 2.0. Because our Ubuntu factory installs ship only Ubuntu LTS releases, we were not able to ship with official Thunderbolt support. However, thanks to the hardware-enablement stack in Ubuntu, starting with the upcoming Ubuntu 14.04.2 you will be able to upgrade your kernel to add some Thunderbolt support. We plan to work with Canonical to recertify the Precision M3800 with official Thunderbolt support.
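As a rough sketch of what that kernel upgrade looks like: Ubuntu's hardware-enablement (HWE) stack ships as meta-packages, and the 14.04.2 stack tracks the newer Utopic-series kernel and X stack. The package names below follow the HWE "lts-utopic" naming convention at the time of writing; verify them against Canonical's LTS enablement documentation before running.

```shell
# Refresh package lists, then install the 14.04.2 hardware-enablement
# kernel and the matching X stack (the "lts-utopic" backport series).
sudo apt-get update
sudo apt-get install --install-recommends \
    linux-generic-lts-utopic xserver-xorg-lts-utopic

# Reboot into the new kernel to pick up the added Thunderbolt support.
sudo reboot
```

Installing the matching `xserver-xorg-lts-utopic` package alongside the kernel keeps the graphics stack in sync with the HWE kernel, which Ubuntu recommends when moving off the GA stack.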
So, do you want a system running Linux? Dell has more client and server systems certified with Ubuntu than anyone else. For me, as a lifelong Linux user, the choice is clear.
by Armando Acosta
Being a data scientist is seen as one of the most lucrative and desirable jobs of the 21st century. But the job comes with challenges for both companies and the data scientists they hire. For a recent article in CMS WiRE, Dell's Joanna Schloss took a deeper look at the shortage of data scientists and some of the possible solutions for success.
Among her suggestions:
1. Hiring from within may be the answer - The perfect data scientist may already be on your payroll. Internal candidates know your business, your culture, and your business goals. They can bring experience an outside candidate simply cannot, which can make for a longer-term, more successful solution. However, Schloss also warns that only the right internal candidate should be chosen, and that candidate must receive your full support. Promoting from within simply because an external candidate cannot be found may cause you to second-guess their insights, potentially negating the benefits you hoped to gain by having a data scientist on staff.
2. The right tools are as important as the right candidate - To be successful and provide proper insights and value, a data scientist must have access to the right tools. One such tool is self-service data technology. With the proper technology and tools, data scientists are better able to capitalize on your company’s big data investments, and provide the valuable business understanding you expect from them.
3. ROI isn't always immediate - Data analysis is an ongoing process, and result times vary. Patience and flexibility are important, but results are often well worth the wait.
Remembering Schloss' insights just may help your company avoid some of the hype and allow you to start reaping the rewards a data scientist can provide.
Supporting hundreds of customers over 15+ years has given me a great opportunity to help large organizations design monitoring solutions. A common challenge is how to standardize monitoring after acquisitions without alienating newly acquired teams and the processes and software they depend on.
Several parallel trends have been occurring that may seem unrelated on the surface, yet have a big impact on your efforts to standardize monitoring...
So what does this have to do with monitoring? While the goal is standardization, the methods employed to get there are varied. Here's a breakdown using each of these major trends as an example:
First, with greater use of virtualization, IT operations (often called "shared services" or "corporate IT") has become more efficient and standardized. It can now provide processing power, storage, databases, and so on as a standard service regardless of the application team requiring them. Logically, monitoring requirements then standardize around the operations teams rather than the various development or QA teams, which may each have different tools.
How can Dell help? Dell's Foglight solution creates a monitoring standard with fast time to value across virtualization, databases, storage, and the physical OS. It also integrates directly with application monitoring capabilities, allowing both teams to collaborate.
This brings me to acquisitions. Modern web-based applications interact directly with your customers, suppliers, manufacturers, and distributors. Therefore, an acquisition is often made for the value of these applications as well as the intellectual value of the acquired employees. There's a difficult balance here: after an acquisition, we don't want to disrupt the processes and software these teams have depended upon. However, we DO want to provide value with the full organization's standard operational services and software.
How can Dell help? Initially, costs are cut quickly by deploying the standardized monitoring (and IT services) to the new organization. In cases where the new team's application level monitoring is lacking, Foglight provides deep application monitoring capabilities integrated directly to the operational level monitoring. Over time, existing application monitoring solutions can be phased out by standardizing into Foglight wherever capabilities overlap.
Security for Cloud and On-Premise Services:
The next trend is the need for increased security, especially when the focus is application performance monitoring (APM). APM is a wonderful capability that monitors and verifies ACTUAL user response, transactions, and data processing; however, this capability requires absolute security. Raising some concerns is a recent explosion of cloud-hosted monitoring solutions offering minimal administration and fast deployments. These deployments, however, introduce a valid security concern by monitoring, and even storing, secure user data outside your control at the cloud provider. While this may have been appropriate for the applications of the smaller company prior to acquisition, it often does not comply with the larger organization's security policies. In addition, a myriad of on-site application monitors becomes a major challenge to audit and to assure security compliance.
How can Dell help? Foglight offers both on-premise AND cloud based application performance monitoring capability to give you flexibility of standardizing on a single monitoring platform while still reaping the benefits of both options where appropriate.
The final trend leading to monitoring standards is the use of Dev/Ops methodologies. If you're not familiar with the Dev/Ops concept, think of it as a methodology that helps break down IT barriers between development, QA, and operations. Teams gain greater collaboration while development maintains much of the flexibility of being autonomous (sounds great for acquisitions, right?). During acquisitions, this methodology can be very helpful in that new application teams can operate much as they have in the past, along with their own operations support, while also collaborating and sharing resources with corporate IT operations. A rising challenge with this methodology is that each application group often has its own IT budget and therefore different monitoring tools to manage and maintain. These tools are often very targeted and highly valued by each team, although they typically do not integrate well into corporate standard monitoring.
How can Dell help? To minimize the impact on critical application development, I typically recommend a slower, controlled migration of Dev/Ops teams into corporate standard monitoring. This way, each team can initially continue to operate as it always has, and the impact of monitoring change can occur when appropriate over time. Typically, I've seen that collaboration with other Dev/Ops teams leads to organic adoption of Foglight as these teams come to desire greater monitoring capability and integration with the rest of the organization.
The world of IT is in constant flux, with non-stop changes and challenges. Using Foglight, we've seen organizations be very successful in cutting costs by standardizing "shared services" monitoring while maintaining the flexibility for individual application teams to stay agile. Over time, these teams can organically migrate to the corporate standard by deploying integrated application monitoring. These deployments also bring a higher level of security compliance by streamlining solutions and minimizing the monitoring data stored in externally hosted clouds.