Dell Community

Latest Blog Posts
  • Foglight for Virtualization and Storage Management

    Getting Back to the Promise of Virtualization

    A virtualized environment is everything your organization hoped for and more, right? Almost unlimited resources, easy spin-up/spin-down of virtual servers and big cost savings on hardware. When industry experts, analysts and even your colleagues proclaimed it to be true, you quickly charted your course and set sail on the virtual sea.

    So why hasn’t your ROI materialized? Why isn’t your virtualization strategy panning out the way you expected? Why is your organization experiencing spikes in both operating expenses (OpEx) and capital expenses (CapEx)?

    After speaking with hundreds of customers, we’ve concluded that the answer to these questions rests with one simple fact:

    It’s so easy to create and distribute virtual machines (VMs) that companies have become complacent about the need to properly manage them.  

    In this three-blog series, we'll explain the key concepts you need to optimize virtualization management within your organization.

    VM Density

    While you can create and run dozens of VMs per physical server, keep in mind that even though the workloads are virtual, the server and storage resources are not.

    VM density is an important metric because you trade it off against optimal performance. If your density is too high, your VMs compete for precious resources, which can lead to poor performance. High density can be a problem, but it's easier to spot than low density.

    If your density is too low, then you're underutilizing your physical resources. The most common symptom is excellent performance, which makes it look as though there's no problem at all. However, once you edge back toward high density, performance problems begin to appear. Striking that balance is what optimizing virtualization management is all about.

    VM size is also a factor in density. Creating immense VMs leads to inefficient sharing of disk space, physical memory and CPU.
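
    If you manage a VMware environment, one quick way to eyeball density is a PowerCLI report of powered-on VMs and resource usage per host. Here is a minimal sketch, assuming PowerCLI is installed and connected; the vCenter name is a placeholder:

        # Report rough VM density and utilization per host (illustrative only)
        Connect-VIServer -Server vcenter.example.com   # hypothetical vCenter
        Get-VMHost | Select-Object Name,
            @{N='RunningVMs';  E={($_ | Get-VM | Where-Object PowerState -eq 'PoweredOn').Count}},
            @{N='CpuUsagePct'; E={[math]::Round(100 * $_.CpuUsageMhz / $_.CpuTotalMhz, 1)}},
            @{N='MemUsagePct'; E={[math]::Round(100 * $_.MemoryUsageGB / $_.MemoryTotalGB, 1)}}

    Hosts showing low utilization alongside a low VM count are candidates for consolidation; high utilization with a high VM count suggests density has crossed into contention.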

    VM Sprawl

    Physical sprawl is pretty easy to spot. While rows of servers take up a lot of real estate and cost a lot of money, VM sprawl is less conspicuous and can be more difficult to identify. Almost every virtual data center exhibits some symptoms of VM sprawl:

    • Abandoned VM images—They’re no longer in the inventory, but they’re still in your virtualization environment. Even though they’re invisible, they consume resources.
    • Powered-off VMs—Nobody has started them up for months (years?), yet they still take up physical disk space.
    • Unused template images—If your organization has published templates for creating VMs, are you sure anyone uses them anymore?
    • Snapshots—VM snapshots that go a long time without modification are also disk hogs. If they’ve outlived their usefulness, then they’re candidates for deletion or archiving to inexpensive storage.
    • Zombie VMs—Self-service provisioning is a great idea, until it isn’t. Most users assume there is no cost associated with spinning up a few VMs (and then forgetting about them), but zombie VMs consume resources needlessly.

    VM sprawl leads to wasted storage and computing resources in the virtual environment. It’s inconspicuous at first, but it doesn’t remain that way for long and it eventually affects your performance and OpEx.
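
    Several of these symptoms are easy to inventory with a script. As a minimal PowerCLI sketch (the 90-day staleness threshold is an arbitrary assumption), you could flag powered-off VMs and aging snapshots as sprawl candidates:

        # Flag likely sprawl: long-idle powered-off VMs and stale snapshots
        $cutoff = (Get-Date).AddDays(-90)   # assumed staleness threshold

        Get-VM | Where-Object { $_.PowerState -eq 'PoweredOff' } |
            Select-Object Name, @{N='ProvisionedGB'; E={[math]::Round($_.ProvisionedSpaceGB, 1)}}

        Get-VM | Get-Snapshot | Where-Object { $_.Created -lt $cutoff } |
            Select-Object VM, Name, Created, @{N='SizeGB'; E={[math]::Round($_.SizeGB, 1)}}

    Abandoned images and unused templates take more digging, since they live outside the active inventory, but the same principle applies: measure first, then reclaim.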

    Read the E-book: An Expert's Guide to Optimizing Virtualization Management

    We’re just getting warmed up. In the next two blogs in this series, we’ll take a look at optimizing virtualization management. Meanwhile, download our guidebook, An Expert's Guide to Optimizing Virtualization Management. It highlights five important areas many companies overlook in their virtual landscape where they can usually recover lost ROI and regain the promise of virtualization.

    John Maxwell

    About John Maxwell

    John Maxwell leads the Product Management team for Foglight at Dell Software. Outside of work he likes to hike, bike, and try new restaurants.

    View all posts by John Maxwell | Twitter

  • Dell TechCenter

    Need $50 for Holiday Shopping? Announcing the #ExpectMoreContest

    Finding Work / Life Balance

    What did your workday look like? You'd wake up every morning, commute to the office and start solving problems. Some days, the problems couldn't wait: 6 AM conference calls, email bombardments, systems going down, all before you hit the snooze button. Then, in the evenings, you'd do your best to leave the office at a reasonable hour, which, as you and I both know, didn't always happen.

    But thanks to Dell Software, your workday is now more productive and ends when it should. We get it: work-life balance matters. And when you work more efficiently, you have the time to pursue your passions and do the things you love. In fact, this infographic shows you how!

    And now, we want to hear from you and see your smiling faces! So, we’re launching the #ExpectMore Photo Contest,* giving you the chance to show us what you do with the time Dell Software saves you at work.

    What Is Your Passion?

    Do you use Toad at work to tackle complex databases, then, go home in the evening and fly the drone you and your son built in the workshop?

    Do you optimize your VMware storage infrastructure with Foglight for Storage Management, then coach your daughter’s soccer club to victory on the weekend?

    Do you handle otherwise daunting email migrations with Migration Manager for Exchange, then mountain bike through the trails of your local state park?

    How to Enter

    Entry into the Contest is simple.

    1. Read the Official Contest Rules http://dell.to/1kScbOv
    2. Take a picture of yourself pursuing your passion! (work appropriate, of course)
    3. Follow @DellSoftware on Twitter
    4. Tweet us the picture with the #ExpectMoreContest hashtag and include the phrase ".@DellSoftware allows me the work/life balance to pursue my personal passion in life."

    Prizes

    Each Friday during the Contest Period, we will select the most creative entry. Once selected, we will announce the winner via our @DellSoftware Twitter channel. The winner must then contact us via Twitter to receive a $50 Amazon Gift Card! 

    Examples

    Stefanie quickly runs down database issues so she has time for her true passion - skateboarding!

    Kris easily finishes his endpoint management heavy lifting so he has time for his true passion - strength training!

    Good Luck!

    Expect more from your software solutions with Dell, and start doing more with your free time.

    Winners! 

    We appreciate everyone who entered; you made our decision very difficult!

    Here are the winners:

    * NO PURCHASE NECESSARY. Legal residents of the 50 United States (D.C.) 18 years or older.  Ends 12/17/2015. To enter and for Official Rules, including prize descriptions visit http://dell.to/1kScbOv.  Void where prohibited.

  • Information Management

    #ThinkChat- Gift Giving Gets Groovy

    Follow #ThinkChat on Twitter this Friday, December 4, at 11:00 AM PST, for a live conversation exploring the impact of evolving technology on the retail experience!

    Join us for this December holiday tweet-up and bring your retail experiences to share with all of us. Over the last 10 years, shoppers have lived through an unprecedented evolution in retail technology, from Google Express and Amazon's virtual pantry, which let people order groceries from home and have them delivered, to “showrooming,” where customers visit brick-and-mortar stores to feel, experience and demo products but do all the actual buying online. How has the evolution of retailing affected your gift giving or buying behavior? Are you an online retail consumer? Or do you still prefer to go to your local Fry's Electronics store to pick up a stocking stuffer or two? More interestingly, where do you see this market evolving? How can retailers ride these waves of change? Bring your ideas to the tweet-up to share and discuss.

    Join Shawn Rogers (@ShawnRog), marketing director for Dell Statistica, and Joanna Schloss (@JoSchloss), BI and analytics evangelist in the Dell Center of Excellence, for this month's #ThinkChat as we conduct a community conversation around your thoughts and real-life experiences!

    Questions discussed on this program will include:

    1. Where do you shop first?  Online or in a store?
    2. Do you enjoy interacting with retailers who use recommendation engines and other analytic insights?
    3. Do you find recommendation engines helpful or creepy?
    4. What drives you to online shopping? Ease of use, free delivery, lower prices, wider selections?
    5. Which retailers are leading the race with superior customer service and experiences?  Who's your favorite?
    6. Has online shopping created unique buying experiences?
    7. Is data protection top of mind? How do you protect your personal data?
    8. Does the size of the vendor affect your buying decision?
    9. What are your favorite search engines for online buying?
    10. Do you see a need for 30-minute drone delivery or Uber delivery? Is it a game changer?

    Where: Live on Twitter – Follow Hashtag #ThinkChat to get your questions answered and participate in the conversation!

    When: December 4, at 11:00 AM PST

  • KACE Blog

    How to Save the Equivalent of One Full-Time Employee Salary with IT Systems Management

    The fundamentals of systems management have changed, so you’re faced with managing and securing a growing number of devices, a variety of operating systems and multiple types of users, in addition to your traditional systems management tasks. Despite this acceleration in the scope, complexity and speed of change in your environment, your IT budget most likely remains flat or gets reduced, requiring you to do more with less.

    Doing More with Less

    So, when one organization is able to eliminate IT overtime costs and save one full-time salary annually by automating its anypoint systems management tasks, I like to share the story with other IT pros. First, let me tell you about some of the organization’s systems management challenges. They’re probably similar to the challenges you face every day. While this organization happens to be an educational institution, endpoint management issues are the same whether your organization teaches students, saves lives or manufactures widgets.

    Westphal College of Media Arts and Design at Drexel University has an IT staff of five, a director and four technicians, who are responsible for managing 800 PC, Mac and Linux desktops, including performing manual upgrades. The technicians scrambled from machine to machine, sometimes remotely, to install updates to the operating system, browsers, plug-ins and software applications during the one-week break between academic quarters.

    No matter how quickly they worked, they were unable to deliver consistent systems management, maintenance and updates across their IT environment. For example, they had no way of ensuring that all machines were running the same version of applications, nor could they easily determine which computers were out of sync with the others.

    Challenges Ahead

    The team’s approach to remote systems management was to obtain or build installers containing the updates, and then use a variety of tools like PsExec, Active Directory and Apple Remote Desktop to deploy them across the network. Using this approach, it was impossible to report on whether the updates had been successful and on which machines.

    Needless to say, this manual approach to systems management took a toll on the team’s overtime budget, with the OT payroll inflating to 100 hours during the week between quarters. And, while the technicians were focused on deploying upgrades and patches, they didn’t have time to support the users’ other needs or address new IT initiatives.

    The Solution Became Clear

    Prompted by these inefficiencies in timing and consistency, as well as a university-wide security initiative to encrypt all computers, the director tasked his team with finding a way to replace its manual processes with an all-inclusive automated solution to anypoint systems management. After listening to their needs, a reseller recommended the Dell KACE K1000 Systems Management Appliance. The team looked at other tools, but after a brief trial, the organization purchased the K1000.

    They immediately saw that the KACE appliance addressed their biggest pain point with the software distribution, managed installations and patch management required to keep the desktops up to date and secure. The greatest time savings came with the ability to reuse their work once they loaded a patch or managed installation into the K1000. 

    They then began expanding their use of the KACE appliance. After several months of success managing installations and scripting remotely, they took a broader view and began consolidating their information systems. Using the K1000’s integrated service desk functionality, they realized flexibility they never had before as they could now create triggers, custom ticket roles and direct connections into inventory that showed all requests associated with each machine. Next, they built custom assets and email alerts in KACE to help them track loaned equipment, so they wouldn’t miss due dates.

    Cost Savings for Drexel University

    The K1000 Systems Management Appliance was quickly paying for itself. The organization eliminated overtime during break week, going from 100 extra hours to finishing a day and a half early with the K1000. According to the IT director's FTE calculation, the cost savings to his department is equivalent to the annual salary of one full-time IT pro. His department also benefits from compliance with the university's security initiative. The KACE appliance provides automated patching as well as the reporting tools needed to show that the encryption agent is present on all 800 computers and to assist in documenting that the IT group is in full compliance.

    With the K1000, the IT staff can also offer a shorter turnaround time on break fixes. Once IT has identified the problem and verified the fix, IT can deploy it centrally in hours instead of days and make the computer available to users much more quickly than before. The IT director is also seeing the strategic benefit of the KACE appliance as it affords him a comprehensive overview of all 800 desktops.

    As this organization discovered, manual or individual point solutions no longer suffice in today’s IT environments. IT professionals must now view anypoint management as an imperative that cannot be ignored and one that needs to be addressed with an all-inclusive solution.

    Watch the Full Story

    I love to share KACE success stories, but I know you’d rather hear directly from your peers. So I’ve included a link to a 4-minute video featuring Jason Rappaport, director of IT, Antoinette Westphal College of Media Arts and Design, Drexel University, along with some members of his IT team. In the video, they detail how they were able to create a central view of their multi-platform environment, implement reporting on 800 desktops to comply with the organization’s security initiative, and speed application deployment with Dell KACE appliances.

    About Stephen Hatch

    Stephen is a Senior Product Marketing Manager for Dell KACE. He has over eight years of experience with KACE and over 20 years of marketing communications experience.

    View all posts by Stephen Hatch

  • Dell TechCenter

    Introducing Thin Import with Storage Center 6.7

    Written by Kris Piepho, Dell Storage Applications Engineering

    For customers looking to migrate data from a PS Series array to an SC Series array, Storage Center 6.7 now includes the Thin Import feature.

    Thin Import works at the block level and uses synchronous replication to import data from PS to SC Series storage. All blocks on the source LUN are read and then written to the target volume on the SC Series array, with the exception of zeroed blocks, which are not actually committed to disk. The result is a thin-provisioned volume on the SC Series array.
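
    To make the zero-block behavior concrete, here is a conceptual PowerShell sketch: copying an image while seeking past all-zero blocks so they consume no space on a sparse or thin-provisioned target. This only illustrates the idea; the actual Thin Import runs inside Storage Center, and the paths and block size here are assumptions.

        # Conceptual sketch: copy a disk image, skipping all-zero blocks
        $blockSize = 512KB
        $src = [System.IO.File]::OpenRead('C:\temp\source.img')   # hypothetical paths
        $dst = [System.IO.File]::Open('C:\temp\target.img', [System.IO.FileMode]::Create)
        $buf = New-Object byte[] $blockSize
        while (($n = $src.Read($buf, 0, $buf.Length)) -gt 0) {
            $allZero = $true
            for ($i = 0; $i -lt $n; $i++) {
                if ($buf[$i] -ne 0) { $allZero = $false; break }
            }
            if ($allZero) {
                # Seek past the zeroed block instead of writing it
                $dst.Seek($n, [System.IO.SeekOrigin]::Current) | Out-Null
            } else {
                $dst.Write($buf, 0, $n)
            }
        }
        $dst.SetLength($src.Length)   # keep the target's logical size
        $src.Close(); $dst.Close()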

    How does Thin Import work?

    Thin Import works in one of two modes: online and offline. In online mode, a destination volume is created on the SC Series array and mapped to the server, and then data is migrated to the destination volume. I/O from the server continues to both the destination and source volumes during the import, so online mode can be used for importing volumes that host mission-critical applications. Offline mode simply migrates data from the source volume to a destination volume; it does not recreate the mapping on the source volume. Online imports tend to take longer than offline imports because I/O continues to the volume from the server.

    How long does an import take?

    This varies with the available bandwidth between the arrays, the amount of data to be transferred and, in online mode, the volume's workload. Another factor is the location of the destination volume: by default, the import process writes data to the lowest tier of storage. Although writing to faster disks usually means faster import times, it's a good idea to leave this setting alone, because importing directly to Tier 1 could fill all available space in that tier.

    How can I do it?

    Luckily, everything you need to take care of is covered in the best practices guide. This guide includes key prerequisites for both PS Series and SC Series arrays that need to be completed before starting an import. 

    For you visual learners, be sure to watch the demo video that accompanies the best practices guide. The video is a great way to see the import process in action.

    Good luck and happy importing! 

  • Dell TechCenter

    Introducing Windows Server 2016 Nano Server Technical Preview 4


    Note: Dell does not offer support for Windows Server 2016 at this time. Dell is actively testing and working closely with Microsoft on WS2016, but since it is still in development, the exact hardware components and configurations that Dell will fully support are still being determined. The information in our online documents prior to Dell launching and shipping WS2016 may not directly reflect Dell-supported product offerings with the final release of WS2016. We are, however, very interested in your results, feedback and suggestions. Please send them to WinServerBlogs@dell.com.

    Nano Server is the new Windows Server 2016 installation option, positioned as a purpose-built operating system for cloud applications.

    • Nano is a lightweight, small-footprint OS that can be customized to run specific workloads. It is designed for compute and storage hosts in private clouds and enterprise datacenters.
    • It has a limited range of roles, features, drivers and applications that are available to be installed into the base Nano image. All are installed offline using existing Microsoft tools, such as DISM.
    • It has a smaller attack surface and requires fewer patches and reboots.
    • It installs and reboots much faster than a full OS, thus reducing the downtime drastically.

     

    Nano Server TP4, released on Nov 19, 2015, is available as a .wim file in the WS2016 OS media, along with the roles and drivers that can be added to it.


    Fig. 1. Inside Nano Server folder on Windows Server 2016 TP4

     

    PowerShell deployment scripts are available to help automate the VHD image creation process. The resulting VHD can then be used to deploy Nano as a virtual machine on Hyper-V, or for bare-metal OS installation.
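
    For reference, a typical invocation of the TP4 image generator looks roughly like the sketch below. Treat the module name, parameters and paths as approximations of Microsoft's published script rather than a definitive reference.

        # Approximate usage of Microsoft's Nano Server image generator (TP4 era);
        # parameter names and paths are assumptions and may differ from the shipped script
        Import-Module .\NanoServerImageGenerator.psm1

        # -MediaPath points at the mounted WS2016 TP4 media;
        # -GuestDrivers includes drivers for running Nano as a Hyper-V guest
        New-NanoServerImage -MediaPath 'D:\' -BasePath 'C:\NanoBase' `
            -TargetPath 'C:\Nano\Nano01.vhd' -ComputerName 'Nano01' -GuestDrivers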

    Unlike any Windows OS before it, Nano has no GUI and only limited local interaction with the OS. Nano is purpose-built to be managed and maintained via a remote management station. Once you log in to the OS, you see a 'Nano Server Recovery Console' which provides general information about the installed OS and displays the NIC IP addresses. The two configuration options are Networking and Firewall.


    Fig. 2. Nano Login on Dell iDRAC console for Dell PowerEdge R730 XD

    Fig. 3. Nano Server Recovery Console on Dell iDRAC console for Dell PowerEdge R730 XD


    The primary method for managing Nano Server is Windows PowerShell remoting: you set up PowerShell sessions to the Nano OS and run commands over the network. Dell iDRAC also provides a great way to monitor and manage Nano Server. We will look at this functionality in detail, along with ways to create, configure and deploy Nano Server on Dell PowerEdge servers, in a series of blog posts. So stay tuned!
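
    As a minimal sketch of that remoting workflow (the IP address and credentials are hypothetical), from a management station you would trust the Nano host in WinRM and open a session:

        # Hypothetical address; use the IP shown on the Nano Server Recovery Console
        $nanoIp = '192.168.10.50'

        # Allow this management station to talk to the workgroup Nano host over WinRM
        Set-Item WSMan:\localhost\Client\TrustedHosts -Value $nanoIp -Force

        # Open a remote PowerShell session and run commands against the Nano OS
        $cred = Get-Credential "$nanoIp\Administrator"
        $session = New-PSSession -ComputerName $nanoIp -Credential $cred
        Invoke-Command -Session $session -ScriptBlock {
            Get-NetIPAddress | Select-Object IPAddress, InterfaceAlias   # verify networking
        }
        Enter-PSSession -Session $session   # or continue interactively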

     

    Resources

    Nano Server Technical Preview 4 download: https://www.microsoft.com/en-us/evalcenter/evaluate-windows-server-technical-preview

    What's New in Windows Server 2016 Technical Preview 4 

    Getting Started with Nano Server: https://msdn.microsoft.com/en-us/library/mt126167.aspx

    Nano Server on Channel 9: https://channel9.msdn.com/Series/Nano-Server-Team

    How to use WDS to PxE Boot a Nano Server VHD: http://blogs.technet.com/b/nanoserver/archive/2015/06/03/how-to-use-wds-to-pxe-boot-a-nano-server-vhd.aspx

    For more information on Nano Server, visit Dell TechCenter: Nano Server



  • Statistica

    Free Statistica at College: The Gift That Keeps On Giving

    We've all been to college at one time or another. Some of you reading this post are still in school even now. And the majority of us are probably still paying off student loans.

        

    Speaking of college costs, maybe you have already learned about Dell Statistica's response to students in need. Our answer: FREE academic software!

    Major Costs Add Up at School

    Ponder your college years for a moment. Good times and challenging courses. But let's focus on the ROI struggle of the whole college experience. What are your top complaints in this regard? If they relate to costs, you are in broad company. A nationwide campusgrotto.com survey of higher education students reveals a list of popular complaints, a measurable share of which stem from costs:

    • “The price of textbooks!”
    • “College is too expensive.”
    • “The cafeteria food is gross.”
    • “Being broke all the time.”

    Okay, we can't help you with the cafeteria food, but you'll notice the other complaints are indeed about costs.

    Additionally, a plurality (39%) of respondents to Princeton Review's recent "College Hopes & Worries Survey" said their biggest concern is the level of debt incurred to pay for a degree.

    It comes as no surprise that everything at college costs more money than we like, and it all adds up. Consider textbooks alone, the bane of every undergrad out there. Costs vary greatly from one major to the next, but assuming new book purchases are required, a study based at the University of Virginia indicates that a statistics major is neither the most nor the least expensive when it comes to textbooks. However, the study did find the average statistics textbook costs about $110, and students must buy multiple textbooks throughout that major's curriculum. The most expensive statistics book topped out at $342.

    And, as if that weren’t enough…students in the data sciences get to tack on the cost of basic analytics software, too. It's like buying a virtual textbook on top of the physical textbooks.

    What is the skills gap?

    Meanwhile, though it may vary from industry to industry, the data scientist skills gap is real. As long ago as 2011, McKinsey & Company was already reporting that there would be a shortage of the talent organizations need to take advantage of big data. Barring some kind of change in the human resources supply chain, they predicted that by 2018 “the United States alone could face a shortage of 140,000 to 190,000 people with deep analytical skills as well as 1.5 million managers and analysts with the know-how to use the analysis of big data to make effective decisions.” This is great news for students looking to break into this career path.

    Change that Matters

    So, our free academic program in North America is the kind of “change” we can apply readily to impact that human resources pipeline at the university level. It may not sound like much, but remember that every little bit helps when we are talking about reducing the financial burden of students seeking a strong foundation with skills-based training and key software tools in order to increase their value in the competitive data science field.

    Think about it: The world needs more statistics and data science graduates to handle the deluge of big data challenges that are developing in every industry. Would the cost of just one more textbook—or, in this case, an analytics software package required by the professor—make or break the average student's ability to pursue the degree? Why risk it? We'll just give it away and let the chips fall where they may! If we choose to give away some software to help put more problem-solvers into the world’s workforce, then that's what we will do.

    And the value of such a program? Priceless! Not only is the free academic bundle a boon to the study of analytics in North American academia, but because it will expand the pool of graduates qualified for real-life analytical pursuits across industries, the effects of this program are literally immeasurable, with potentially world-changing impact. You just never know where the next genius case study will originate. Truly, the gift that keeps on giving.

    Read the Oct/Nov Statistica Newsletter >

  • Information Management

    IoT in Manufacturing: A #ThinkChat Recap

    Shawn Rogers, Joanna Schloss and Paul Hiller joined the #ThinkChat to discuss IoT's effect on manufacturing. What does it mean to your business? How is IoT used day to day? What barriers exist in the adoption of IoT in manufacturing? And more! Check out the recap of some of the highlights below!

    Thanks for joining the conversation, and don't forget to mark your calendar for our next #ThinkChat on December 4th at 11:00 AM PST.  We'll be discussing how technology has transformed retailing during the holidays!  Follow #ThinkChat on Twitter and join in!

    View the whole #ThinkChat on Twitter

     

    What does IoT mean to you and your business?

    From Shawn:

    • “IoT ‘means’ new avenues to data-driven innovation and higher levels of value from analytics. It also ‘means’ new challenges to security and infrastructure. IoT is very exciting! I think industrial IoT is a bigger opportunity than the consumer side. Real value for enterprise insights.”

    From Jo:

    • “IoT for me still means all devices that are connected to the Internet, which includes mobile phones, smart watches and ATMs. It is also a realization of value for the existing infrastructure investments.”

    From Paul:

    • “Business perspective of IoT is kinda like dating, nervous: ‘Do I want to spend the rest of my life with her?’”

    From the Community:

    • “By itself, IoT will provide some interesting products and services and business processes, but more is needed…” @JAdP
    • “P2… cross-industry solution spaces addressed only by the convergence of IoT with data management and analytics DMA…” @JAdP
    • “P3 … sensor analytics ecosystems SAE with communities and marketplaces for data and insights will form IoT DMA” @JAdP
    • “P4 IoT will have huge impact on design, manufacturing and logistics” @JAdP

     

    What does IoT mean for manufacturing?

    From Shawn:

    • “IoT manufacturing is a deep opportunity. End to end data to drive quality and optimization is a great use case. Presents opportunity for companies to understand the life cycle of their products. Will they use that data for good or for evil?”
    • “Product quality assurance is more granular than the past. Easier to spot/predict issues & avoid downtime.”

    From Jo:

    • “Dell Gateways & other high tech fabs use IoT to connect systems to processes creating RT feedback loop.”

    From Paul:

    • “Product quality assurance can be much more granular than in the past. Easier to spot/predict issues & avoid downtime.”

    From the Community:

    • “@GTNexus great example of manufacturing IoT with slant towards the deliver end of supply chain.” @StevnPhillips
    • “IoT already commonly applied to manufacturing for remote monitoring, increasingly 4 warranty/SLA mngt. Cost savings=ROI->adoption” @chris_rommel
    • “In almost every market vertical, we have seen customer use cases for IoT data and Sensor Analytics.” @JAdP
    • “Advances in pharmaceuticals going from research to production was only possible due to real-time Sensor Analytics” @JAdP

     

    On the day to day, how do you use IoT?

    From Shawn:

    • “Best way into #IoT is to enhance or optimize existing processes. No need to reinvent. Building automation, Manufacturing are two great places to start IoT.”

    From Jo:

    • “People interact with IoT, unbeknownst to them, with mobile delivery of coupons, mobile payment plans and, of course, SOCIAL media... staying connected with people, institutions, and personal interests and data. IoT is pervasive.”

    From Paul:

    • “@DellBigData @shawnrog but competitors' IoT ecosystems will come knocking on your factory door. Gotta answer it.”

    From the Community:

    • “To forecast weather to place big bets on orange & pork belly futures. IoT bigdata for Trading Places.” @chris_rommel
    • “Future looking: agile design will create one-off custom versions matching my environment and duty-cycle…” @JAdP
    • “… with such one-offs CAD file being delivered to my 3Dprinting closet at home or work…” @JAdP
    • “For today, I just wonder why my health monitor and its app are so dependent on the Cloud with no Edge analytics.” @JAdP

     

    Are there any barriers to the adoption of IoT in manufacturing?

    From Shawn:

    • “IoT presents similar challenges as most IT projects. Must have scope control, defined goals and start small. IoT needs end to end infrastructure, security and edge analytics. Without them it lacks flexibility to perform.”

    From Jo:

    • “Currently the barriers are many - but the top ones to consider are dollars, people, and existing infrastructure.  Dollars - manufacturers need to have the money and services to revamp and assess the existing technology.”

    From the Community:

    • “Cultures of conservatism and long deployment/design-in cycles are biggest barrier to IoT in manufacturing. Patience over hype.” @chris_rommel
    • “Incumbency and inertia will always stymie innovation.” @chris_rommel

     

    What exactly is connectivity?  Is the IoT always “on” or can things connect themselves temporarily?

    From Shawn:

    • “IoT connectivity is driven by the devices on the edge, some are on all the time, some are smarter.”

     From Jo:

    From the Community:

    • “IoT Connectivity is the first step in IoT and casual connectivity is still the norm.” @JAdP
    • “One difference in IoT today is that connectivity is more likely to on-demand, when needed, not when available.” @JAdP

     

    What is the difference between “Internet of Things” and “many things with communicative apps”?

    From Shawn:

    • “It’s the sensors and the data types that make the difference between IoT and application data.”

    From Jo:

    • “IoT is Hardware+Software for a non-specific use case whereas apps are software+specific use case on non-specific hardware.”

    From the Community:

    • “It’s less about communication and more about the data being collected and analyzing it for greater effect http://bit.ly/1QuMDDI …” @Joshoward
    • “First we need to consider the difference between simply connected and connected sensors that truly communicate via #M2M” @JAdP

    For more information on IoT in Manufacturing, download our White Paper, "Key Considerations for Analytics Platforms in Regulated Manufacturing".  See you December 4th for our next #ThinkChat!

  • General HPC

    Meeting the Demands of HPC and Big Data Applications by Leveraging Hybrid CPU/GPU Computing

    “Rack ‘em and stack ‘em” has been a winning approach for a long time, but not without its limitations. A generalized server solution works best when the applications running on those servers have generalized needs.

    Enter “Big Data.” Today’s application and workload environments may be required to process massive amounts of granular data and thus often consist of applications that place high demands on different server hardware elements. Some applications are very compute intensive and place a high demand on the server’s CPU, while others in the same environment have unique processing requirements best performed on specialized graphics processing units (GPUs).

    Whether it is customer, demographic or seismic data, or a whole host of other uses, the number crunching and processing required across the suite of applications can result in processing demands that are radically different from those of prior years. Enter Hybrid High Performance Computing. These systems are built to serve two masters, CPU-intensive applications and GPU-intensive applications, delivering a hybrid environment where workloads can be optimized and run times reduced through ideal resource utilization.

    The results of Hybrid CPU/GPU Computing adoption have been impressive. Just a few examples of how Hybrid CPU/GPU Computing is delivering real value include:

    • Optimizing workloads across CPU/GPU servers
    • Delivering the highest density and performance in a small footprint
    • Providing significant power, cooling and resource utilization benefits

    You can learn more about leveraging hybrid CPU/GPU computing in this whitepaper.

  • Information Management

    Home in Time for Jeopardy – Simplifying Migrations and Upgrades, Part 3 [New e-Book]

     When you’re up to your ears in a database upgrade project, like an upgrade to Oracle 12c, you begin to look forward to little things.

    Like going a few hours with no system notifications or text messages from your DevOps team.

     

    Like buying lunch in the cafeteria and actually getting to eat it there, instead of at your desk.

    Like getting home in time to watch Jeopardy! with the kids.

    Of course, you could spend an entire career performing upgrades and running migration projects, and never hit all three of those, or any of your favorite little things. But looking forward to them is always the light at the end of the tunnel.

    Home for dinner and Jeopardy!

    If you want to achieve some of those little things while you’re in the middle of your migration or upgrade project, have a look at our new e-book, Simplify Your Migrations and Upgrades. We wrote it to give you some high-level perspective before you get bogged down in the project itself.

    Part 1 takes you through some of the basics on avoiding risk, downtime and long hours, including the five common pitfalls that afflict most migration projects:

    1. Poor planning — Before you jump in, analyze all your applications, processes and access requirements. That will let you begin the project confident that you’ll have adequate resources in place for the migration.
    2. Underestimating user and business impact — Don’t underestimate the effect of migration on your co-workers’ ability to do their job. Nobody can work efficiently when resource-intensive migration tasks are bogging down the network and servers.
    3. Inconsistent or missing strategy for coexistence — If you can’t pull off your entire migration at once, you’ll have to live with one foot in the old world and the other in the new one. Ask yourself how you’ll make coexistence work and for how long you’ll have to make it work.
    4. Inadequate data protection — Common sense calls for backing up regularly, but you can lose sight of common sense in an overwhelming migration project. And test for recovery while you’re at it, in case you suddenly need to restore data.
    5. Failure to focus on management — Scheduling, project management and progress reporting will ensure that you’re getting as much out of the new system as you thought you were going to. Continue with robust management even after you’ve finished the migration project itself.

    Keep in mind the best ways to avoid those pitfalls:

    • Insist on adequate time for testing.
    • Arm yourself with a backup plan.
    • Take advantage of the comprehensive, end-to-end support built into dedicated software tools.

    Have a look at the e-book for more ideas on structuring your migration and upgrade projects. A half-hour of Jeopardy! with the kids beats a half-hour of DNS changes and node restarts any day.

    Steven Phillips

    About Steven Phillips

    With over 15 years in marketing, I have led product marketing for a wide range of products in the database and analytics space. I have been with Dell for over 3 years in marketing, and I’m currently the product marketing manager for SharePlex. As data helps drive the new economy, I enjoy writing articles that showcase how organizations are dealing with the onslaught of data and focusing on the fundamentals of data management.

    View all posts by Steven Phillips | Twitter