The democratization of HPC is under way. Removing the complexities traditionally associated with HPC, and focusing on making insightful data more easily accessible to a company’s users are the lynchpins to greater adoption of high performance computing for organizations beyond the more traditional groups.
HPC is no longer simply about crunching information. The science has evolved to include predicting and developing actionable insights. That is where the smaller, newer adopters uncover the true value of HPC.
However, these organizations can become overwhelmed by the amount, size, and types of information they’re collecting, storing, and analyzing. Increasingly, these enterprises are identifying HPC as an efficient and cost-effective solution to quickly glean valuable insights from their big data applications.
That cost-effective efficiency can yield impressive measurable results. In just one example, Onur Celebioglu, Dell’s director of HPC & SAP HANA solutions, Engineered Solutions and Cloud, cited how HPC has allowed life sciences organizations using big data to slash genetic sequencing from four days to just four hours per patient. That reduction has provided an untold improvement in treatment plans, which has bettered the lives of patients and their families.
Greater democratization also occurs when companies realize it is possible to leverage HPC, cloud, and big data to benefit their business without abandoning their existing systems. The ability to build onto an existing system as business needs warrant allows more organizations that otherwise couldn’t reap the benefits of HPC to do so.
You can read more about the democratization of HPC at EnterpriseTech.
I'm pleased to announce the availability of .NET monitoring with the Foglight APM SaaS Edition.
The .NET agent includes full support for major Foglight APM features such as:
In addition, the .NET agent also monitors the health of your IIS server.
We're pretty excited about it, and we hope you'll try it at www.foglight.com/trials.
In earlier blogs, I’ve explained how important it is for systems management solutions to save educational institutions time and money and enhance student learning by keeping devices secure and available. But what about the systems management solutions themselves? With limited IT staff and budget, educational institutions need tools that are easy to deploy and use and that will continue to deliver value as the institution grows — without requiring increased headcount.
Florida’s Seminole County Public Schools, for example, was very concerned about ease of installation and maintenance when it began looking for a comprehensive systems management solution. Some vendors, the district found, proposed solutions that would have required IT staff to install, configure and maintain multiple servers. Moreover, some products had multiple components that needed to be integrated, making the solution much tougher to deploy district-wide. These choices were simply too complex and expensive to maintain, the district decided.
These sentiments are echoed by many other schools and colleges, including the San Bernardino County Superintendent of Schools (SBCSS) in California, which was looking for an integrated solution to replace the seven different products it was using to perform inventory, imaging and remote system management. To support 33 school districts across 22,000 square miles, SBCSS needed to be able to install images remotely and with as little manual work as possible, as well as identify and remove malware and unauthorized software when affected machines join the network. Ease of use and automation in systems management, the district knew, were critical to supporting its educational mission and growing digital curriculum.
More broadly, educational institutions also need the flexibility to implement systems management in a way that best fits their environment — physical, virtual or in the cloud. They also need a simple plug-and-play architecture that virtually eliminates installation and maintenance, along with support for a broad range of operating systems and applications.
To learn about how organizations like yours have discovered and implemented systems management solutions that are designed to be both immediately productive and trouble-free for the long term, be sure to read our new solution brief.
About Lolita Chandra
Lolita is a Product Marketing Manager for Dell KACE. She has over 10 years of product marketing experience with IT software and infrastructure-as-a-service solutions.
View all posts by Lolita Chandra
by Armando Acosta
Big data has become an increasingly vital component of the business decision process. That was very evident at this year’s Hadoop Summit in San Jose, California. With training, three full days of content, over 125 speakers, exciting keynotes, and some 4,000 participants, it provided a plethora of valuable information.
During the conference, Dell and Syncsort made a pre-announcement of the Dell | Cloudera | Syncsort Data Warehouse Optimization – ETL Offload Reference Architecture, which will launch next month. This new RA for ETL offload allows companies to reduce Hadoop deployment times, develop ETL jobs within hours, and become fully productive within days. In turn it can lead to lower data transformation costs and can provide operational efficiencies that lay a strong, cost-effective, secure and scalable foundation for managing data on an ongoing basis.
As part of the pre-announcement, we sponsored a well-attended panel presentation with these partners to discuss this industry-first solution. We’ll be posting a link to the panel presentation soon. Additionally, theCube hosted two live broadcasts to talk about the pre-announcement and details of the solution. You can view both theCube interviews below.
As always, the Hadoop Summit reminded participants of just how exciting the world of Big Data has become. It is extremely rewarding to be a part of pivotal new developments that advance the industry for everyone’s benefit.
Promising students in South Africa will now have an exciting new opportunity to gain deeper, hands-on experience in high performance computing (HPC). A partnership between the South African Department of Trade and Industry (DTI), the Centre for High Performance Computing (CHPC), and Dell has resulted in a new IT academy.
Slated to open in January of 2016, each year the Khulisa IT Academy will play host to promising students from economically disadvantaged areas throughout the country. "Khulisa" translates as "nurturing" in the isiZulu language.
The purpose of the academy is to grow the skill set and experience of young South Africans pursuing careers in HPC. During their two-year terms at the academy, students will be able to marry the theoretical aspects of HPC they have learned in the classroom with real-life, practical experiences offered through various industry internships.
To allow the students to concentrate on their education and future professions, each will receive a stipend for the duration of their time at the academy. Upon graduation, these rising HPC stars will be ready to enter into careers in any number of industries.
Dell is honored to be able to play a small role in helping these worthy students. The company is investing financially in the academy, as well as offering startup funding for the ventures of students with proven entrepreneurial skills.
Priceline.com tracks more than 600,000 properties. It receives more than 10 million unique visitors per month. Each spends more than 5 minutes per visit on average. The site ranks in the top 200 of U.S.-based websites.
Now, that’s a busy database environment.
Priceline.com’s IT strategy depends on increasing uptime and stamping out downtime. If a company like that can implement replication of its Oracle databases without interrupting service to its customers, then so can you.
Think of the thousands, millions or billions of records in your production database. They’re your company’s gold mine, but only when people are constantly using them.
With all that demand, you’d better have a copy of the production database running on an alternate system. What if all those departments started working on prod at the same time? You’d have huge performance problems, and then nobody would be happy, least of all your prospects and paying customers. Worse yet, if you needed to take prod down for maintenance, or if it failed, then you’d have big availability problems.
That’s what data replication is for. Replication is more than just a backup or snapshot of the database at a specific point in time. Customers, Marketing and Operations don’t want to work with stale data, so the goal of replication is to maintain a duplicate that is as close to real time as possible without affecting prod or its resources.
Priceline.com uses data replication for high performance and high availability on its production databases to avoid these three issues:
1. Outages and downtime during upgrades
With all the layers under priceline.com — applications, servers, databases — there is plenty of upgrading, updating and patching going on all the time. The company maintains multiple real-time replicas of its Oracle databases so that it can switch applications among them, perform rolling upgrades to new database versions, update hardware or software on servers and even remove systems altogether, yet avoid interrupting service to its customers. This allows Priceline.com to maintain close to 100 percent continuous Oracle database availability, making outages due to database issues virtually non-existent.
Without replicas of the production database, Priceline.com would have to schedule customer-facing downtime during upgrades. That would be a big competitive disadvantage, not to mention lost revenue.
2. Reporting affecting performance
Simple reporting may not seem burdensome, but the wrong string of SQL queries suddenly thrown at prod can sorely affect performance. As a result of Priceline.com’s replication strategy, business users’ queries run against near-real-time records on secondary servers, offloading that work from prod so it stays free for customer transactions.
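The offloading idea above boils down to read/write splitting: read-only reporting queries are routed to a near-real-time replica, while transactional statements stay on the primary. Here is a minimal illustrative sketch of that routing logic, not Priceline.com’s actual setup; the connection names are placeholder strings standing in for real database connections.

```python
# Minimal read/write-splitting sketch: reporting reads go to a
# near-real-time replica, transactional writes stay on the primary.
# The "connections" here are placeholder strings; in a real system
# they would be connections obtained from a database driver.

class QueryRouter:
    def __init__(self, primary, replicas):
        self.primary = primary
        self.replicas = list(replicas)
        self._next = 0  # round-robin cursor over the replicas

    def route(self, sql):
        """Return the connection a statement should run on: read-only
        SELECTs rotate across replicas; everything else hits the primary."""
        if self.replicas and sql.lstrip().lower().startswith("select"):
            conn = self.replicas[self._next % len(self.replicas)]
            self._next += 1
            return conn
        return self.primary

router = QueryRouter("prod-db", ["replica-1", "replica-2"])
print(router.route("SELECT COUNT(*) FROM bookings"))   # replica-1
print(router.route("SELECT * FROM rates"))             # replica-2
print(router.route("UPDATE bookings SET status = 1"))  # prod-db
```

Production-grade routers also have to account for replica lag and failover, but the core idea is the same: keep expensive analytical reads off the system your customers are hitting.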
3. Migration between computing platforms
Linux much? Priceline.com does. Over a couple of years, the company moved a dozen nodes and the Oracle databases running on them from Sun SPARC to Linux on Intel. IT moved one server at a time without any outages, with data replication mitigating the performance and availability risks riding on a project like that.
To learn more about how Priceline.com maintains nearly 100 percent uptime, have a look at our case study “Online travel company keeps its website humming 24/7/365.”
Considering how many problems a data replication strategy can solve, your environment can never be too busy to figure out your own strategy. However, you can be too busy to execute it if you’re using tools that are cumbersome to use and replicate inefficiently, burdening your resources and production server.
Read this technical brief, “Ensuring High Availability for Critical Systems and Applications” to find out how the right tool set can help your organization ensure continuous uptime of Oracle databases to improve overall system availability. Topics include offloading reporting, hardware and software changes, Oracle platform migrations, and high availability and disaster recovery strategies.
In the wake of the recent OPM cyber breach, federal CIO Tony Scott recently announced a 30-day “Cybersecurity Sprint” requiring agencies to immediately take steps to improve protection of federal information and resilience of federal networks.
Tony Scott’s initiative comes following the latest battles in the ongoing cyberwar against the United States government and an alarming increase in cyber threats. In fact, a February 2015 report issued by the U.S. Government Accountability Office (GAO) found that over the past eight years, incidents reported by federal agencies to the U.S. Computer Emergency Readiness Team (U.S. CERT) have increased by 1,121 percent, reflecting 67,000 reported incidents in 2014.
The use of the word “sprint” signifies that the CIO is utilizing a methodology designed to deliver results fast. At the same time, the Cyber Sprint encompasses a wide range of critical cybersecurity elements, recognizing the need for holistic security and an active, rather than reactive, security posture. This presents agencies with a significant challenge, but one that they have the resources to address.
Within the confines of the Cyber Sprint, agencies must address four critical security efforts:
Immediately deploy indicators provided by DHS regarding priority threat-actor Tactics, Techniques, and Procedures (TTPs) to scan systems and check logs
As a part of the Cyber Sprint, agencies will now be required to immediately report any evidence of malicious cyber activity. Real-time reporting is essential for quick remediation of cyber incidents. Luckily, today’s next-gen firewalls, coupled with insight into abnormal network activity enabled by robust identity and access management (IAM) approaches, make these capabilities possible and give agencies a head start on their sprint. Dell SonicWALL offers next-gen firewalls that can correlate and present data from servers, network switches and firewalls.
Patch critical vulnerabilities without delay
Cyber criminals often have advanced resources available for cyber exploits, yet the vast majority of cyber intrusions take advantage of easily identifiable – and easily remediated – vulnerabilities. With the right tools in place, this is a simple element of the Cyber Sprint. Dell can identify and deploy patches for endpoints and servers and also provide updated virus signatures and deep packet inspection through next generation firewalls. Dell’s KACE systems management appliances enable rapid and effective patch management across heterogeneous enterprises of all sizes.
Tighten policies and practices for privileged users
Privileged users often hold the keys to the kingdom when it comes to sensitive government data. The Cyber Sprint seeks to mitigate this potential threat by limiting and controlling privileged user access. Additionally, Tony Scott has stressed the importance of tightening policies for privileged users. Privileged account management tools can help tighten these policies without prohibiting necessary access. Dell’s privileged account management offerings allow agencies to control the resources available through privileged accounts, while also controlling, monitoring and producing reports on the activities of these individuals. Dell is the only vendor that offers solutions in each area detailed by Gartner in its Privileged Account Management Market Guide.
Dramatically accelerate implementation of multi-factor authentication, especially for privileged users
Internal threats have been recognized as a critical security concern, often providing intruders with easy access to sensitive data. Multi-factor authentication provides an additional line of defense against external bad actors posing as qualified insiders – one that has been mandated by government for the past decade through Homeland Security Presidential Directive-12 (HSPD-12). Dell can provide hardware and software tokens for multi-factor authentication and help agencies integrate existing multi-factor authentication infrastructures with modern as well as legacy applications. Dell’s Defender multifactor authentication solution requires no dedicated server and can authenticate against already-in-place Active Directory infrastructure, facilitating this step for agencies in a hurry to get to the finish line.
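To make the second factor concrete: the hardware and software tokens mentioned above commonly generate time-based one-time passwords (TOTP, RFC 6238), deriving a short code from a shared secret and the current time so that a stolen password alone is not enough to log in. The sketch below is a generic illustration of that mechanism, not Dell Defender’s implementation.

```python
import hashlib
import hmac
import struct
import time

def totp(secret, for_time=None, step=30, digits=6):
    """Compute an RFC 6238 time-based one-time password (HMAC-SHA1).

    secret:   shared key as bytes
    for_time: Unix timestamp (defaults to now)
    step:     time-step size in seconds (30 is the common default)
    digits:   length of the resulting code
    """
    if for_time is None:
        for_time = int(time.time())
    counter = for_time // step                # number of elapsed time steps
    msg = struct.pack(">Q", counter)          # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# RFC 6238 Appendix B test vector: this secret at T=59 yields "94287082"
print(totp(b"12345678901234567890", for_time=59, digits=8))
```

The server computes the same code from its copy of the secret and compares; because the code changes every 30 seconds, an attacker who phishes a password still cannot authenticate without the token.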
Get on your mark and get ready for the sprint - Dell stands ready to help federal agencies achieve the cybersecurity improvements with which they’re tasked. Learn more about Dell’s end-to-end security offerings here: http://software.dell.com/solutions/security/.
Dell also offers end-to-end solutions to address the NIST Cyber Framework. To learn more please visit: http://software.dell.com/nistframework/.
I’m often asked how the new age of big data will impact small- and mid-sized businesses (SMBs). Can they keep up? Can they stay relevant in the age of big data? The answer is a resounding yes. After all, big data really represents all data, regardless of what it looks like or where it resides. We’re talking social media, internet, IoT, images, and digital media, as well as all those forms of structured data we’re quick to forget about but that still dominate the data management landscape. In no uncertain terms, the SMB is just as interested – if not more interested – in taking advantage of this pool of data. Just like their enterprise counterparts, SMBs are actively searching for the value and business opportunity hidden within data.
In many ways, the new data landscape is completely altering the old competitive landscape as it pertains to SMBs and enterprises. The growth of so-called big data hasn’t made SMBs less competitive or less innovative. In fact, it’s done just the opposite. Thanks to advancements in our ability to capture and analyze data, SMBs can now drive innovation in ways previously reserved for the enterprise. Managing “all data” gives businesses, regardless of size or budget, the ability to better understand their customers, their businesses, and their marketplaces. All of which means that in this new data ecosystem, SMBs are more competitive with bigger, richer enterprises than they’ve ever been in the past.
Which brings us to Microsoft. Microsoft, like Dell, has long been known as a champion of the middle market, and, again like Dell, it is clearly focused on taking that commitment to the next level amid the changing data landscape. You can already see how customers’ need to corral big data is impacting the way Microsoft supports the SMB ecosystem. Microsoft has invested in morphing existing products and creating new ones to support the evolving needs of SMBs. A great example of this is the company’s aggressive investment in Azure. Other examples of Microsoft investing heavily to meet the changing needs of SMBs can be seen in the company’s Excel and SharePoint brands.
To understand what this all means for Dell (spoiler: this is a great thing for Dell), let’s look more closely at Azure. Azure opens up a cloud-based approach to big data storage, delivering opportunities for the SMB to enjoy a pay-as-you-go approach. The Azure platform opens up a means for SMBs to experience small footprints of the big data ecosystem without committing precious resources to these efforts. These smaller chunks of data are also more representative of what an SMB might need to store and archive. After leveraging Azure storage, Azure ML allows customers to experiment with big data using machine learning, essentially setting up an analytic sandbox for the organization to explore and experiment with analytic capabilities in a meaningful and “by design” method. Here’s where Dell comes into the mix. Dell Statistica allows SMBs to easily leverage its predictive power in tandem with Azure ML’s compute power to build a best-in-breed solution for advanced analytics.
The combination of Azure and Statistica provides just one great example of Microsoft and Dell technology working together to benefit SMBs. The potential for SMBs to leverage Dell technologies to deliver value on top of their Microsoft investment is virtually limitless, and it’s one of the primary characteristics differentiating Dell in the information management space. Not only is our ability to help you get the most out of your Microsoft investments unique, but so is our willingness. For SMBs and enterprises alike, Dell is the platform-agnostic vendor. Our relationship with Microsoft is stronger than ever, but when we say all data, we mean it. So, whatever investments SMBs make and whatever path they travel down in order to get more out of their data, we can go down it with them. That’s what all data is all about.
See how data growth and new technologies are affecting the DBA ― read the eye-opening study today.
You never know what you will learn from the helpful Statistica newsletter. (Subscribe for free.) Our recent June issue brought to our readers’ attention that legacy StatSoft’s social media properties have been changed up quite a bit. You won’t find us by the same name anywhere anymore!
Well, that’s not entirely true, but readers did learn that our old Facebook page has now expanded to include all our fellow software teammates in Dell’s Information Management Group. So, now when you go visit our page, you will find our Statistica content mixed with that of Dell Boomi, SharePlex, and Toad. It’s like we suddenly discovered an extended family with whom we share much in common—primarily we share the successful end-to-end workflow of YOUR data. We think you should stop by and get to know these family members, too, and then “like” them the way you’ve always liked Statistica.
Find all our new social media links in the June newsletter >