Dell Community

Blog Group Posts
Application Performance Monitoring Blog Foglight APM 105
Blueprint for HPC - Blog Blueprint for High Performance Computing 0
CommAutoTestGroup - Blog CommAutoTestGroup 1
Custom Solutions Engineering Blog Custom Solutions Engineering 6
Data Security Data Security 8
Dell Big Data - Blog Dell Big Data 68
Dell Cloud Blog Cloud 42
Dell Cloud OpenStack Solutions - Blog Dell Cloud OpenStack Solutions 0
Dell Lifecycle Controller Integration for SCVMM - Blog Dell Lifecycle Controller Integration for SCVMM 0
Dell Premier - Blog Dell Premier 3
Dell TechCenter TechCenter 1,853
Desktop Authority Desktop Authority 25
Featured Content - Blog Featured Content 0
Foglight for Databases Foglight for Databases 35
Foglight for Virtualization and Storage Management Virtualization Infrastructure Management 256
General HPC High Performance Computing 226
High Performance Computing - Blog High Performance Computing 35
Hotfixes vWorkspace 57
HPC Community Blogs High Performance Computing 27
HPC GPU Computing High Performance Computing 18
HPC Power and Cooling High Performance Computing 4
HPC Storage and File Systems High Performance Computing 21
Information Management Welcome to the Dell Software Information Management blog! Our top experts discuss big data, predictive analytics, database management, data replication, and more. Information Management 229
KACE Blog KACE 143
Life Sciences High Performance Computing 6
OMIMSSC - Blogs OMIMSSC 0
On Demand Services Dell On-Demand 3
Open Networking: The Whale that swallowed SDN TechCenter 0
Product Releases vWorkspace 13
Security - Blog Security 3
SharePoint for All SharePoint for All 388
Statistica Statistica 24
Systems Developed by and for Developers Dell Big Data 1
TechCenter News TechCenter Extras 47
The NFV Cloud Community Blog The NFV Cloud Community 0
Thought Leadership Service Provider Solutions 0
vWorkspace - Blog vWorkspace 510
Windows 10 IoT Enterprise (WIE10) - Blog Wyse Thin Clients running Windows 10 IoT Enterprise Windows 10 IoT Enterprise (WIE10) 3
Latest Blog Posts
  • SharePoint for All

    #SharePoint2013HardwareRequirements - A Consideration #qsharepoint #sp2013

    We’re still very early in the launch countdown for SharePoint 2013.

    I’ve been lucky to start presenting and demoing SharePoint 2013 recently. Early in each presentation, I show Microsoft’s minimum support levels for installing SharePoint:

    Server Operating Systems

    • Windows Server 2008 R2
    • Windows Server 2012

    Memory

    • 8GB RAM (Foundation)
    • 12GB RAM (Farm Server)
    • 24GB RAM (Single Server)

    http://technet.microsoft.com/en-us/library/cc262485.aspx

    Database Servers

    • SQL Server 2008 R2
    • SQL Server 2012 “Denali”

    8GB RAM PLUS

    “Wow!” people say, “Microsoft sure raised the specs!” Well, yes -- and no.

    For planning, you'll be moving to a "clean" system anyway – there is no in-place upgrade. As a result, think of the process as a migration, not just an upgrade. At a system level, if you were already running Windows 2008 R2 and SQL Server 2008 R2, you're fine. But there were a lot of installations on Windows 2008 and SQL 2005 in the 2010 world. If you're already moving to new hardware, there may be little if any benefit in upgrading the systems you're leaving. And if you're starting with Microsoft's public preview of SharePoint 2013, check out Quest's free SharePoint 2013 Migration Suite.

    Also, those memory requirements are, officially, larger than the 8GB recommendations for SharePoint 2010. But they're really no different from what folks in the community – including me – have been writing and speaking about for the past three years. If you want the best performance – add memory. For a good user experience on a SharePoint 2010 Enterprise system – as well as 2013 – you're usually looking at 16-24 GB RAM.

    Can SharePoint 2013 run in less memory? Sure! I’ve launched full single server demos in 8GB, 4GB – once only 2GB! But it’s slow – even in single user mode – and will not give acceptable enterprise class performance at scale. These recommendations are merely “official-izing" what’s been commonly understood for a while.

  • SharePoint for All

    Migration Tip of the Week: InfoPath List Forms

    InfoPath Forms Services, which was introduced in the Enterprise edition of SharePoint Server 2007, gives us a lot of power in terms of enhanced data collection in SharePoint applications. You can publish an InfoPath form to a Form Library to capture and process rich XML forms which are rendered right in the browser. InfoPath 2010 has been integrated with SharePoint 2010 even further. Now you can replace standard ASP forms that come with lists and libraries (i.e. display, edit, new) with your own forms that take advantage of the advanced features in InfoPath. The best part is that you can design those forms without writing any code.

    As I am a migration guy, I'm primarily interested in how my customers can keep those cool InfoPath forms when migrating their sites and lists using Quest Migration Suite for SharePoint.

    The answer is that you can transfer InfoPath list forms to their new location using Migration Suite, SharePoint Designer and InfoPath. Wondering how? Read on!

    1. The first step is to use Migration Suite to copy the list to a new location. In order to copy the form files, you will need to check the Copy SharePoint Designer Objects option in Profile Manager:

      If you're wondering what Copy InfoPath does: it includes InfoPath forms when you copy form libraries.
    2. Next, just copy your site or list. You can verify that the form files were copied successfully in SharePoint Designer 2010 or in Migration Suite directly (use the Show SharePoint Designer Objects command in the right-click menu on the target list). The files should be located under the folders corresponding to the attached list content types:
    3. If you try to use the forms now (e.g. add a new list item), you will still see the standard list form. To fix that, we will need to use SharePoint Designer 2010.
    4. Connect to the target site in SPD and go to All Files.
    5. In the All Files view, locate the list folder under Lists and then drill down to the Item subfolder (your CT may differ):
    6. Double-click the InfoPath form template file (template.xsn) to open it in InfoPath 2010.
    7. InfoPath will show a warning message asking you to update the form fields. Click Yes and re-publish the form.
    8. At this point InfoPath adds a new form template to the list and another Browser Form web part to each of your form ASPX files, which prevents the forms from being rendered properly. So we need to fix this little problem in SharePoint Designer.
    9. Go back to All Files and open the display ASPX form for editing in code view. Most of the content in the file (highlighted) comes from a master page. You will need to scroll down to the Zone Template element.
    10. Next, comment out the web part that points to the original form template (template.xsn) and save the file:
    11. Repeat steps 9 and 10 for each ASPX form.

    You're done!

    Naturally, there is no limit to the creativity of form designers. You might run into forms that use data connections and other advanced capabilities, which will require a few extra steps to make the form work. I hope this post arms you with a general technique for approaching InfoPath list form migration.

  • Dell TechCenter

    Motherboard Replacement on PowerEdge Servers

    Dell has published a whitepaper on Motherboard Replacement using the iDRAC7.  Here is a quick overview of what you can look forward to in the paper:

    This feature makes an IT admin's life easier in the event that the motherboard on a Dell server has to be replaced.

    Trying to manually write down all system settings and configuration is tedious, time-consuming, and error-prone. Using Dell's export (backup) and import (restore) features makes returning a system to its original state automatic.

    The procedure to accomplish this task is as follows:

    1. Create a backup image of the system in case the motherboard needs to be replaced later on
    2. Replace the broken motherboard with a new one
    3. Restore the desired image onto the new motherboard using either of the following options:

    What is restored?

    • Service Tag (Motherboard replacement only)
    • Component firmware
    • Component configuration data
    • Component usernames, passwords, certificates, and licenses
    • USC

    As you may discover later, this feature is exposed as an API, but it can easily be scripted using tools already available in the OS, such as winrm, or client libraries such as those from the open-source OpenWsman project.
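    To illustrate the scripting angle, here is a minimal shell sketch that assembles `wsman` (OpenWsman's wsmancli) invocations for the backup and restore steps. The host, credentials, selector string, and the DCIM_LCService method names (BackupImage, RestoreImage) are assumptions for illustration only; consult the whitepaper for the exact class, methods, and parameters.

```shell
#!/bin/sh
# Sketch: compose wsmancli commands for the backup/restore flow described
# above. Method names and selectors are illustrative, not authoritative.

IDRAC_HOST=${IDRAC_HOST:-192.168.0.120}   # iDRAC7 IP (placeholder)
IDRAC_USER=${IDRAC_USER:-root}
IDRAC_PASS=${IDRAC_PASS:-calvin}

# Assumed resource URI + selectors for the Lifecycle Controller service
LC_SVC='http://schemas.dell.com/wbem/wscim/1/cim-schema/2/DCIM_LCService?SystemCreationClassName=DCIM_ComputerSystem,CreationClassName=DCIM_LCService,SystemName=DCIM:ComputerSystem,Name=DCIM:LCService'

# Compose (and print) a `wsman invoke` command for a given LC method.
# Pipe the output to sh to actually run it against the iDRAC.
lc_invoke() {
    method=$1
    echo wsman invoke -a "$method" "\"$LC_SVC\"" \
        -h "$IDRAC_HOST" -P 443 -u "$IDRAC_USER" -p "$IDRAC_PASS" \
        -y basic -V -v
}

lc_invoke BackupImage    # step 1: back up settings before the swap
lc_invoke RestoreImage   # step 3: restore them onto the new board
```

    Printing the command first (rather than executing directly) makes it easy to review the invocation before pointing it at live hardware.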

    See the full article in this whitepaper:

       Motherboard Replacement on PowerEdge Servers



    iDRAC with Lifecycle Controller Technical Learning Series.

    Lifecycle Controller Home : iDRAC7 Home

     

  • High Performance Computing - Blog

    Chattanooga a Technology Leader with HPC at SimCenter

    From more efficient Boeing turbine engines to cutting drag on commercial trucks to reduce pollution and save millions of dollars in fuel costs, the SimCenter in Chattanooga is leveraging high performance computing (HPC) to make a difference in all areas of our world today. In a recent article in the startup publication Nibletz, the author explores a variety of problems the SimCenter is addressing with its HPC resources. Read the full article here: Chattanooga’s SimCenter Could Use The Gig To Plan For The Zombie Apocalypse And More

    One of my favorite parts of the article is when the SimCenter was used to simulate a hazardous materials spill in a metropolitan area, with the goal of containing the disaster and preventing greater loss of life. The simulation has been unofficially referred to as a Zombie simulation, apparently helping make sure fewer of our citizens become one!
     
    And while you may not initially think of Chattanooga as a leading high-tech hub, according to the article the city was the first to have a 1-gigabit fiber-optic network. I was surprised and impressed to learn that fact. Given the city's technology-savvy nature, it's not a stretch to see how this HPC site in Tennessee is working with some of the biggest companies in the world, as well as government entities like the Department of Defense, and even providing a resource for weather and coastal flooding forecasts for public safety.
     
    It's exciting to see HPC making an impact not only everywhere in our daily lives, but also in different locations across the U.S. Congratulations to Chattanooga for being a technology leader, and recognizing the power of supercomputers to help make a difference.

  • Dell TechCenter

    Chat with Us about Windows Server 2012 on Sept 4th, @ 3PM CDT

    This post was originally written by Michael Schroeder from the Dell Operating Systems Engineering team (OSE)


     

    Join the Dell Operating Systems Engineering team to learn about the latest developments with Windows Server 2012 running on Dell Products.

    Join us at 3pm Central time on September 4th, 2012 @ http://www.delltechcenter.com/chat

    From the behind the scenes development efforts that helped shape Windows Server 2012 to the partnership between Dell and Microsoft to qualify the OS on Dell products, the OSE team has been heavily involved with Windows Server 2012 for several years. We’ll cover as many new features as we can. We would also like to hear from you what new Windows Server 2012 features you are most interested in.

    Here are a few of the topics we will cover:

     

    Base Platform and BIOS Support for Windows Server 2012:

    Hyper-V:

    • Scalability Improvements
    • Single Root I/O Virtualization (SR-IOV)
    • Dynamic Virtual Machine Queues (DVMQ)

    Networking:

    • Consistent Device Naming (CDN)
    • NIC Teaming

    Storage:

    • Offloaded Data Transfer (ODX)
    • Storage Management  

    Systems Management:      

    • OpenManage 7.2 support
    • Cluster Aware Updating
    • iDRAC/PowerShell CIM cmdlets

     
    Join us at 3pm Central time on September 4th, 2012
     to learn more about Microsoft Windows Server 2012 & Hyper-V 3.0!

     

  • Dell TechCenter

    A lightweight WSMAN client for Ubuntu (WSL)

    As Jared pointed out in his blog post, there isn't a readily available package for wsmancli, a popular client tool for WSMAN. Thanks to him for providing an install package for Ubuntu 12.04. But what about a package for other Ubuntu releases? Or a package for other Linux distros?

    Introducing WSL!

    WSL* (pronounced "whistle") is a set of shell scripts that serve as a command-line client for WSMAN. It requires only tools already present in a standard Ubuntu installation. There is no binary to worry about, and no need to compile or build a package for a particular distro or version.

    To verify, I booted an Ubuntu 10.10 Live CD and from a shell...

         $ # Download the zip package
         $ # Unzip the file
         $ # Untar the content
         $ cd dcim
         $ ./wsl enum DCIM_SystemView
           # you'll be asked for IP address and credential

    That's it: you now have a working WSMAN client.

    Make sure you visit the WSL wiki for complete information.

    The WSMAN targets I used were PowerEdge R720 and R610 servers with the Lifecycle Controller in iDRAC. There's a lot more you can get from this interface, and you might be surprised what you've missed. This video blog may help, even as a refresher.

    Like it?  Have a suggestion?  Leave a comment below.


    Cheers!


    * An open source community-based project.

  • SharePoint for All

    Getting Ready for #SharePoint 2013 – Webinar Followup @cmcnulty2000 #qsharepoint #sp2013

    We had a fantastic crowd online the other day for our initial webcast in our series on preparing for SharePoint 2013 migrations. The SharePoint 2013 public preview (beta) has already attracted a lot of attention. And Quest is also ready with our free SharePoint 2013 Migration Tool.

    In getting ready for SharePoint 2013, the major activities are:

    • Establish a Governance Program
    • Adopt Code-Free Customization
    • Inventory And Assessment
    • Externalize Before Migration
    • Content Consolidation

    The recorded webcast is available now at http://www.quest.com/webcast-ondemand/five-ways-to-prepare-for-sharepoint-2013817953.aspx

    We had some great discussions during Q&A, but of course we ran out of time for all of them. Here are some answers for the rest of the audience:

    Q: Are unique permissions and folder structures really a thing of the past, and frowned upon when it comes to documents in SharePoint? What is the best route for a company whose users rely on the folder structure to move to a metadata tagging philosophy?

    Yes and no. Unique permissions are absolutely part of the plan moving forward – the Share button, in fact, makes it easy for users to set unique permissions and send links to SharePoint content. Folders are de-emphasized, but they can still be used to provide subsets of content types or other default metadata options. I’ve written about this before, but one of the best ways to move to a “tagging” philosophy is to predefine a small set of initial tags as taxonomy terms in the Term Store, instead of waiting for users to create their own keywords.

    Q: Any introduction of new web parts? Is there a lot of development going on for the SharePoint store? I have not personally visited that area yet.

    The new application model is one of the exciting frontiers for SharePoint 2013. Quest is well underway with activities to introduce new solutions to the online marketplace, and although I can’t comment on our specifics, please stay tuned.

    Q: Will there be actual mobile apps, or just specially formatted mobile versions of pages? Also, what about general cross-browser compatibility (e.g. HTML5)?

    In general, the ISV community is using the new architecture to develop many application shapes and formats – including apps that are designed to be hosted inside SharePoint, Office, or as part of a Windows 8 desktop or mobile app. HTML5 is a major design pattern being emphasized moving forward. Developers who adopt it (Quest does!) will be able to have those app elements rendered on a variety of client systems, such as iPad, Windows 8, or Windows Phone 7.

    Q: To add to my question: I really want to know whether the development community is embracing that initiative/offering.

    This varies – since the app architecture steers us to run the apps “off-server” in a new API, there’s a range of opinions. The old on-server API is still supported, so you’ll see a range of early adopters as well as legacy coders.

    Q: We have 10 TB of data on our CIFS shares; the average file size is about 1.55 MB, with 6,658,580 total documents. Candidate for RBS?

    1000% yes. A fully externalized content database would still have a large-ish footprint – but capacity and performance would be massively improved with RBS. Quest Storage Maximizer for SharePoint can help manage your RBS environment.

    Q: I hear SP 2013 is the last major release for SP and that Microsoft will release feature enhancements instead. Can you confirm that is the philosophy going forward?

    Microsoft has not announced this. I think it's fair to anticipate more frequent feature updates to the online Office 365 platform, with consolidated feature rollups to the on-premises version. But I would still expect to see SharePoint 2015/Wave 16 in a few years.

    Q: BI- How are the data warehouses generated?

    Typically these are built in SQL Server Analysis Services (SSAS) by a developer. Other line of business systems, such as Microsoft Project Server, include SSAS cubes in the base build. Quest will provide access to our SharePoint management and metric data through our Site Administrator solution.

    Q. Would non-collaborative workloads include site and page customizations with custom master pages and layouts, like custom branding on a set of sites?

    Sure – but this usually refers to workloads like BI, social, search, workflow, forms, and custom application development.

    More SharePoint Sunrises (c) 2012 Christopher F. McNulty - Flickr

    Thanks again to everyone for participating, and please stay tuned for more sessions in our SharePoint 2013 readiness webinar series.

  • Dell TechCenter

    Dell PC Diagnostics Just Got Even Better

    This blog post was originally written by the Dell eSupport team.
     
    Dell PC Diagnostics is a free, self-service diagnostic tool that provides a set of robust hardware diagnostics to Dell Home, Small Business, Public, and Large Enterprise customers worldwide.
     
    Features of Dell PC Diagnostics include:
    • Ability to run self-diagnostics on Dell PCs at a convenient time
    • Targeted troubleshooting recommendations for unique hardware and symptoms
    • Easy solutions to the most common problems
    • Automated online dispatching for hard drives and memory in the US and Canada for diagnostic failures
     
     
    On August 22nd, 2012 Dell eSupport teams launched enhancements to this innovative online tool. Dell PC Diagnostics now supports all major browsers, including Internet Explorer 6 through 10, Chrome, and Firefox. In order to improve customer experience with a shorter diagnostic run time and better recommendations, the tool also provides more scan options to choose from:
    • Targeted scans based on symptoms of commonly reported issues
    • Single hardware scans for quick pinpoint accuracy
    • Complete system scans for routine maintenance and checkups
     
    Visit the Dell PC Diagnostics page to download and evaluate this free tool.
  • Dell TechCenter

    Dell DTCUG-IP at VMworld 2012: Thank you for your support.


    It goes without saying that Dell TechCenter is there for its customers. So, to say thank you, we hosted our fourth annual Dell TechCenter User Group (DTCUG) at VMworld, a way to give back to you for keeping our community strong and vibrant. We want you to know how much we appreciate every question, answer, post, response, and comment by giving you what you want out of a conference: knowledge. That is why we had Jason Boche, Todd Muirhead, and Dave McCrory present some new and compelling information especially for you.


    Last night's event kicked off with Jason giving us a demo and overview of Dell Compellent and the Storage Center integration with VMware VASA, followed by Todd, who told us what is new in performance on vSphere 5.1. Dave then took the stage to talk about a different way to look at data and the applications that use it, something he calls Data Gravity. Finally, we closed the night with some great prizes, with five winners taking home SonicWall firewalls. Congratulations to the winners, and thanks to the speakers for another great event.