Join us on Wednesday, November 5, at 3:00 p.m. to hear how to align Big Data initiatives with business goals and share data across systems in a cost-effective, well-managed way, while quickly and securely making insights available to business teams across your organization for analysis, discovery and, ultimately, organizational progress.
Dell's Teresa De Onis and Uday Tekumalla will lead this discussion, together with Karim Lokas of Omneo. You'll learn:
Make data the lifeblood of your organization: join us for this information session Wednesday, November 5, from 3 to 4 p.m. in Room 17A.
Join us to learn how Omneo is enabling its customers to mine and uncover insights from data using the Dell│Cloudera Hadoop Solution. Omneo's Rami Lokas will join Rob Johnson of Cloudera and Joey Jablonski, a Dell Big Data enterprise technologist, for the discussion.
During this presentation you will learn how the information gained from data can:
- Drive improved quality control through a 360-degree view of product quality and performance across the supply chain
- Impact the bottom line with millions of dollars in savings, based on the ability to identify and address supply chain issues in near real time
- Ensure the flexibility to support various workloads and users
- Enable interactive data analytics that deliver significant advantages
To accommodate the expected crowds, we've scheduled this presentation twice!
Wednesday, November 5, from 12:00 to 1:00 p.m. in Room 12B, and Thursday, November 6, from 8:30 to 9:30 a.m. in Room 13A.
Get the most out of your data - join us at the Dell World “Driving Insights from Data with Dell │ Cloudera Big Data Hadoop” presentation.
We’re excited to announce that Migration Suite for SharePoint and Site Administrator for SharePoint are now Microsoft Azure Certified!
In addition to being listed on the Azure website, Migration Suite for SharePoint and Site Administrator for SharePoint are now available in the new Microsoft Azure Marketplace, where customers can search for and deploy thousands of third-party solutions to simplify the management of their applications on Azure.
What does this mean to Dell Software customers and/or partners?
Customers and partners now have another deployment option for these two Dell Software solutions. In other words, rather than buying hardware on-premises or standing up a virtual machine to install Migration Suite or Site Admin, you can start from an Azure Gallery image that has our software pre-installed. When you fire up the image, you are prompted for a license key, which you need to purchase through Dell Software.
For existing Azure customers who have enterprise agreements with Microsoft for their Azure consumption (i.e., compute, storage, data, networking and app services), there are no additional fees to run the Dell Software tools in your environment. The Dell images simply bill against your Microsoft contract.
About these Dell Software solutions
With Migration Suite for SharePoint, you can migrate SharePoint data, Exchange public folders and Windows file shares to a newer version of SharePoint, hosted SharePoint, or to Office 365, with minimal downtime and no data loss.
Site Administrator for SharePoint increases visibility into your environment, streamlining daily management and reducing the time spent on larger administrative jobs.
The K1000 Agentless technology, introduced with version 6.0, opens up a wealth of possibilities. Agentless inventory is most commonly used for devices on which you might not otherwise install an agent. This might be a mission-critical server that you want to keep safe from accidental updates, or an operating system for which no agent exists.
The new Agentless functionality enables K1000 appliance administrators to add these devices to the inventory and poll them periodically, keeping the inventory record updated and relevant. Here are great articles that will help you get started with these two common scenarios:
The concepts and methods detailed in these articles can, of course, be applied to other, similar operating systems. Think of what you have that's not yet in your appliance: now you can have it inventoried if you want!
by Jimmy Pike
High performance computing can offer an organization tremendous value by cutting costs, reducing time to market, enabling life-changing discoveries or improving any number of other quantifiable measures. However, there is still a very real disconnect between the value realized through HPC and the ability to justify the expenditure.
According to a recent survey conducted by Intersect360 for the Council on Competitiveness, roughly three-quarters of the companies asked said that HPC was critical to their organization's competitiveness. Yet approximately two-thirds also indicated that they face some degree of challenge in justifying the expenditure, with one in ten calling it a significant challenge.
From the survey we can identify two primary challenges companies face when considering HPC: price and return on investment (ROI). Although the price point can be a hindrance, especially for smaller organizations, it is the difficulty of clearly showing ROI that makes the cost such a hard obstacle to overcome. After all, HPC doesn't always allow for a prediction that $X spent will yield $Y returned.
So, what can be done? In the long term, solutions such as increased scalability and greater government investment will help bridge the divide between need and expenditure. In the meantime, we have a duty right now to help educate key decision makers about the ROI available through HPC.
For example, I recently spoke with some industry leaders who admitted that without supercomputing, it can take three to five years to get their products safely completed. Yet their decision makers are unwilling to make the HPC investment needed to safely and successfully reduce that time frame. My response was simple: what do your companies have to lose? You can build, test, rebuild and repeat, or you can simulate and then test. The latter provides significant ROI.
Additionally, there is a growing demand for iterative research. Rather than running a batch, stopping it, making a change and running it again, researchers in some environments can now change variables along the way without launching a new batch. That ability can prove invaluable.
Finally, for smaller companies, there are some options to "test run" software to see how costly any required licenses will be. The National Center for Supercomputing Applications (NCSA) at the University of Illinois, Urbana-Champaign, for example, allows companies to use its iForge cluster to better understand what performance and scalability gains will be realized under various conditions. (You can read more about the program in an earlier blog.)
Ultimately, it's up to us to help industries better understand and explain the myriad benefits that come with high performance computing. Because when organizations recognize that the ROI is well worth the expenditure, everyone benefits.
If you're interested in the full methodology, comments and results from the Intersect360 / Council on Competitiveness survey, you can access it here.
Here are the top 10 reasons NOT to build a Halloween Dashboard with Foglight:
#10 - I don't know how to drag and drop.
#9 - I can't find a good background image.
#8 - I don't like dashboards, I miss the green terminal.
#7 - Last time I built a dashboard I ate the jelly beans.
#6 - If I built a fun dashboard everybody will want to use it.
#5 - My mouse ate my monitor.
#4 - Last time I looked at the spinners I was hypnotized.
#3 - I don't like fun.
#2 - I use TOAD and I thought it was called Frog-lite.
#1 - I am too scared of skulls and pumpkins.
Since there really is no excuse NOT to build a fun Halloween Dashboard with Foglight, I hope you find this video fun and useful.
Happy Halloween everybody!
You want to run your business on data and deliver results right now. Who doesn’t? But to achieve these goals, you need to bring together data from disparate sources and perform in-depth analysis. How can you achieve agile data integration while helping to ensure the quality and consistency of data at the same time?
In this Dell on-demand webcast, Philip Russom, research director for data management at the research firm TDWI, outlines three pillars of agile data integration. By building on these pillars, organizations can deliver data integration solutions sooner, better align those solutions with business goals and ultimately free up resources to develop more solutions.
Pillar 1. Enable self-service data integration
The process of gathering requirements for data integration can be time-consuming, but according to Russom, it doesn't have to be. Providing technical and business teams with self-service tools that incorporate data profiling, data discovery and data visualization capabilities can accelerate the process. Those tools help teams record requirements as they work, helping to eliminate the weeks or months otherwise spent interviewing various stakeholders.
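To make the profiling idea concrete, here is a minimal sketch of the kind of query a profiling pass might run behind the scenes; the table and column names are hypothetical, invented for the example:

SELECT COUNT(*) AS row_count,
       COUNT(DISTINCT customer_id) AS distinct_customers,
       SUM(CASE WHEN email IS NULL THEN 1 ELSE 0 END) AS missing_emails,
       MIN(signup_date) AS earliest_signup,
       MAX(signup_date) AS latest_signup
FROM crm_customers;  -- hypothetical source table

A few summary numbers like these quickly tell a team whether a source is complete and consistent enough to build requirements around.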
Pillar 2. Capitalize on rapid data set prototyping
Creating data set prototypes early in the data integration process allows you to sustain high data quality and avoid issues down the road. Fortunately, many self-service tools enable rapid prototyping of data sets. Technical and business team members can conduct simple data extractions and transformations to produce prototypes quickly and easily, as in the sketch below.
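Here, a first-pass prototype might be nothing more than a quick extract-and-transform over two source extracts; the tables and columns are again hypothetical:

SELECT c.customer_id,
       UPPER(TRIM(c.country_code)) AS country_code,  -- normalize inconsistent codes
       COALESCE(b.total_billed, 0) AS total_billed   -- default for customers with no billing rows
FROM crm_customers c
LEFT JOIN billing_summary b ON b.customer_id = c.customer_id
WHERE c.is_active = 1;

A throwaway prototype like this surfaces quality problems, such as mismatched keys or inconsistent country codes, long before the production integration is built.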
You can achieve agile development and delivery of data integration solutions while also addressing responsible data access and preparation requirements that help ensure the quality and consistency of data.
Pillar 3. Employ data stewardship and facilitate collaboration
Data stewardship plays an important role in successful data integration. A data steward is a member of a business group who helps ensure data management efforts meet business requirements and who can deliver a rapid return on investment. When data stewards collaborate with technical staff on data integration projects, organizations can better align technical work with business requirements. The result is faster development of data integration solutions and fewer overlapping tasks that can delay project completion.
Having the right tools can make it easier for organizations to build on these pillars. In the same webcast, Peter Evans, a business intelligence and analytics product evangelist at Dell, highlights Dell software for information management that can help organizations take advantage of these pillars and achieve successful, agile data integration.
Ready to learn more? View the data integration webcast.
I am happy to announce that Dell Software has published a new tech brief titled Migrating Your IBM Notes Document Libraries to SharePoint.
As your organization plans its migration from IBM Notes to SharePoint, you might expect that migrating your document libraries will be the simplest task. After all, both platforms offer document libraries — shouldn’t you simply migrate each Notes document library to a corresponding SharePoint document library?
The truth is, Notes and SharePoint document libraries are more like distant cousins than identical twins, so document library migration is not that simple. Notes document libraries have a number of features that are not supported in SharePoint document libraries, so a direct migration is not usually the best choice for your business and your users. Fortunately, SharePoint offers a variety of targets for your Notes document libraries — targets that offer the features and functionality you need after the migration. You simply need to understand your options and take some time to plan.
This tech brief details the specific differences between Notes and SharePoint document libraries that affect migration, explains the options you have for SharePoint migration targets, and provides recommendations about when each option is most appropriate. Then it steps you through the process of completing each type of migration quickly and easily using Migrator for Notes to SharePoint.
To download this tech brief, visit http://www.dellsoftware.com/assets/71539/.
To learn more about Dell’s solutions for migrating from Notes and Domino, visit http://software.dell.com/platforms/lotus-notes.
To download a trial copy of Migrator for Notes to SharePoint, visit http://software.dell.com/register/54995/.
by Brandon Draeger
As the adoption of Apache Hadoop in the enterprise gains momentum, the needs of our customers are expanding and evolving at an even more rapid pace. In recent years, customers have focused primarily on proof-of-concept deployments to “kick the tires” on Hadoop and understand where it fits within a data management strategy. As customers complete those evaluations, they need new solution options aligned with the next phases of their big data journey.
Through the three-way partnership among Dell, Cloudera and Intel, those needs are being met in a very holistic way. At the beginning of the life cycle, customers need a quick path from setup to evaluation, which is addressed by the Dell QuickStart for Cloudera Hadoop offering. As customers learn from their POCs, they can adopt and expand their operational footprint with confidence by leveraging the Dell | Cloudera Apache Hadoop Solution Reference Architectures. Finally, as advanced customers begin to look at next-generation technologies such as Apache Spark, they can begin their POC cycle again with the turnkey capabilities of the Dell In-Memory Appliance for Cloudera Enterprise.
These proof points in our joint partnership among Dell, Cloudera and Intel are just the beginning of the benefits our customers will be able to realize as they continue their adoption of Hadoop. For example, the release of Cloudera CDH 5.2, a major milestone in the Intel-Cloudera joint road map, delivers key security and performance features that align with a broader set of enterprise use cases. This release also delivers most of the key features first introduced in the Intel Distribution for Hadoop (IDH), which are now part of the Cloudera distribution, further paving the way for joint Dell IDH customer migrations to Cloudera.
The future is bright for Dell-Cloudera-Intel collaboration into 2015 and beyond as we continue to jointly enable big data adoption for more and more enterprise customers. We will achieve this promising future by delivering integrated Dell-Cloudera-Intel solutions, targeted and optimized for key workloads from silicon to software.
Have you ever been building a custom field in your service desk and thought, “I wish I could use a CSV list for this…”? You CAN! You already know you can import a CSV into the asset tables; what you may not know is that with a simple command and a SQL select statement, you can leverage other tables, like assets, in your service desk custom fields.
So picture this scenario: you need to support every time zone or ZIP code, or some other list that's really time-consuming to type in. You find the perfect list on your network or online, and you clean it up for CSV import if necessary (but it wouldn't need cleanup, because it was perfect, right?!). Then:
1. Create an asset type to hold all these wonderful tidbits.
2. Import the CSV into that asset type.
3. Use a custom query to populate your single- or multi-select field types.
More on that final step here: http://www.kace.com/support/resources/kb/solutiondetail?sol=SOL114937
Now apply that same concept to other types of data you already have. Maybe you run a cellular phone support queue and have all sorts of asset data about your phones. Or maybe you want to build a field that allows users to submit tickets on behalf of others; you could query other tables beyond assets! As always, test your SQL in a safe place before implementing it in a production environment; bad SQL can cause all sorts of havoc.
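For a sense of what that final custom-query step can look like, here is a minimal sketch; the comment line is annotation only, and the asset type ID of 5 is hypothetical, so check your own appliance (and the KB article above) for the real ID and exact syntax:

-- hypothetical: ASSET_TYPE_ID 5 is the asset type created for the imported list
query:SELECT NAME FROM ASSET WHERE ASSET_TYPE_ID = 5 ORDER BY NAME

The query: prefix tells the appliance to populate the field's choices from the result set rather than from a static list; pointing a similar select at a different table is how you would extend the trick beyond assets.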