Harnessing Big Data is an important strategy for every company in today's information-driven world. But where do you begin in choosing a project to utilize this vast resource effectively?

Much has been written about the three Vs of Big Data – Volume, Velocity and Variety. But the unsaid fourth V – Value – is the one where we have to direct our focus. It only makes sense to start where Big Data analytics will have a big impact right away. At EMC, we have seen that there are at least three types of use cases to consider: those involving business data (customer data, product data, quotes and orders, financial data), data center data (events from networks, storage, servers, applications), or security data (events from firewalls, IDS, antivirus, etc.). Chances are, like EMC, your company has already taken steps to consolidate its data and found that traditional data stores and standard analytics tools still do not provide the agility you need.

At EMC, for our first Big Data use case, we chose to focus on the business side of the house, in a project to bolster our Customer Quality (CQ) effort. This is a critical focus that touches on all aspects of our business, so applying Big Data analytics here promised an early and substantial win.

The CQ team tracks quality and analyzes performance data, including information on data availability sent back to EMC via a dial-home system from all hardware installed in customer environments. We use this data, along with customer surveys, to help gauge customer satisfaction.

Leveraging EMC's Greenplum Unified Analytics Platform for this effort provided both the technology to quickly load incoming data and the tools to generate reports in near real time. The results were undeniably fantastic: the CQ team can now load millions of dial-home records in a span of minutes, versus the four to six days it used to take.
They can also produce the reports they need on demand, pursuing questions and shaping reports to meet their specific needs like never before.

What's more, we have built on this experience to offer Business-Analytics-as-a-Service (BAaaS), a model we recently introduced that allows business units to work in their own analytics workspace, with information from our data warehouse and other sources, to pursue research questions they never could before.

While this project has shown the power of Big Data, it is only the beginning of our journey to leverage the benefits of Big Data analytics and BAaaS. We have also made significant progress in predicting certain types of failure events within the data center hours before they happen, and in dealing with security threats. With the help of data science, we expect to unlock the predictive value of our data to hone our business strategy for the future.
We are living in an extraordinary time. The convergence of technology vectors in recent years – advances in microprocessors, storage, networking, virtualization, and the cloud – has produced one disruptive shift after another, empowering remarkable people and organizations to push the limits experientially, conceptually, and socially.

When I spoke at the first O'Reilly Strata Conference in 2011, the big question was, "How does this burgeoning industry accelerate its momentum and continue to push its limits?" In the past two years, we've seen data-derived insight drive business value across many sectors, with McKinsey projecting that the industry will grow to $16 billion by 2015. And its impact goes far beyond the enterprise. Silicon Valley veteran Marc Andreessen recently stated that "software is eating the world," meaning that the world is getting instrumented through the explosion of machine-generated and mobile data. Governments are embracing Big Data solutions, the topic is making front-page news in mass media outlets across the globe, and the concept is on the tips of tongues the world over.

As we look back at the progress made over the last several years, the one thing that has remained consistent is that Big Data is about people and the notion of being part of a larger community. The impact of the efforts of our co-conspirators in this data community – the data scientists and developers who have been instrumental in making data work – cannot be overstated.

At Strata 2011, I observed that this space needed more data scientists, government support, industry investment, and startups.
During my keynote last month at the 2013 Strata Conference, I paid tribute to ten such practitioners, change-makers, institutions, and entrepreneurs who have inspired me and the Greenplum team since then.

DataKind founder Jake Porway brought data science to non-profit organizations to address social and civic causes. Cornell University's Jon Kleinberg reshaped how we think about machine learning algorithms. Andrew Ng and Daphne Koller of Coursera introduced data science and technical education to the world. Alpine Data Labs' Steven Hillion, who founded the analytics team at Greenplum, makes the list, as do Metamarkets CEO Mike Driscoll and Big Data trailblazer Todd Papaioannou. Cowbird founder Jonathan Harris has shown us new ways to tell stories through data and collaboration, while at the government level, the Obama Administration has taken a proactive stance on Big Data, budgeting $200 million to fund Big Data research and development across six Federal agencies.

So what do we need to make data work in the future? More of what has taken us this far already: collaboration.

When I talk about collaboration, I mean it on the same level as a word as powerful as democracy. This means collaboration on software, standards and data. Today's open source is not the open source of twenty years ago – not when hundred-million-dollar investments by venture capital firms expect a billion-dollar return. Within any open source project there will be conflicts about how the project evolves, which is why the development of standards to ensure compatibility and interoperability will be key.

Another critical piece of collaboration is around the data itself. My data and your data are contributing to a unified data set around the world. Examples include healthcare, energy, local economies, global security and the future of education.
I believe that within this global data set are insights into some of the world's biggest problems, pointing to the significant breakthroughs our civilization is looking for. This is what keeps us inspired and drives us to be part of this important community today.
Many CIOs today focus their attention on two key areas:

- Increasing agility in deploying and delivering enterprise applications to business constituents
- Re-architecting the data center to be more cost-effective and efficient by bringing databases into the fold rather than leaving them siloed

Imagine a CIO at a large financial institution. Their customers are global, whether individual traders or financial institutions that manage huge pension funds. They demand instant access to their services to trade and manage funds in real time. The underlying infrastructure, including databases, must process millions of transactions each minute with near-zero downtime to keep these customers not only viable, but also profitable.

"Just a millisecond of added response time could result in decreased profits for a financial services organization and could mean economic disaster for its customers."

This CIO's data centers are mission-critical not only to the financial trading company, but to the broader economy. Countries, enterprises and individual consumers could be financially devastated if the institution's database went offline for even a few minutes, rendering millions of transactions incomplete. For the CIO, the customers' trust and the company's reputation live and die with the speed, reliability, agility and availability of the data center.

Another important requirement is security and data protection. For example, financial organizations are expected to meet the Dodd-Frank Wall Street Reform and Consumer Protection Act, a regulatory compliance law that enforces transparency, accountability and consumer protection for financial processes such as trades and settlements.

Regulatory requirements weigh heavily on CIOs; trading institutions must save records for five years, which requires massive amounts of data to be backed up, archived and retrieved at a moment's notice.
Traditional infrastructure methods for record retrieval can take days or even weeks to search across multiple archived silos, resulting in delayed responses to audits or internal investigations. Additionally, new regulations requiring intraday liquidity management reporting and compliance are driving demand for large, agile system deployments.

Without a doubt, addressing the high-performance and low-latency needs of business applications, including databases, as well as security and availability in the data center, is essential to keeping these businesses viable and profitable. But a financial trading institution's CIO is also looking to re-architect his or her data center to be more efficient and lower costs.

In the past, CIOs kept mission-critical business applications on isolated Oracle, SAP or Microsoft databases because of their unique requirements for high performance, availability and reliability, but that paradigm is shifting. It's no longer cost-effective or efficient for a trading institution to host 200+ applications on numerous databases, with:

- Multiple (regional/geographic) instances
- Excessive heterogeneity/lack of standards
- Overlapping functionality
- Multiple operational/management solutions
- Increased risk of data loss/security breaches
- Difficult and costly regulatory reporting
- Increased capital and operational costs

Consolidation of business applications and databases is becoming the norm.

ESG Lab validated impressive performance results for the Vblock Specialized Systems for High Performance Databases.

As the leader in Integrated Infrastructure Systems,[footnote]Gartner Market Share Analysis: Data Center Hardware Integrated Systems, 2013 and http://www.vce.com/landing/gartner[/footnote] VCE converged infrastructure helps customers deliver business applications and services faster while consolidating data centers to be more cost-effective and efficient.
Databases are one of the top workloads deployed on VCE Vblock Systems today, and we have a broad portfolio of systems to address their needs.

VCE Vblock® Specialized Systems for High Performance Databases is specifically tuned to enable financial institutions and other enterprise customers to meet the highest performance, reliability and availability requirements for online transaction processing (OLTP) and online analytical processing (OLAP), while helping to reduce risk and costs. The system also lowers the time it takes to recall archives – from several weeks to two or three minutes, or even less.

The Enterprise Strategy Group (ESG), an independent industry analyst firm, recently performed an ESG Lab validation of the Vblock Specialized Systems for High Performance Databases, finding that VCE provides:[footnote]ESG Lab Review, Vblock Specialized Systems for High Performance Databases, October 2014[/footnote]

- Extremely high throughput of 4.1 million sustained read IOPS
- A 22.5 terabytes-per-hour data load rate for migrating to the new environment
- 38.7 gigabytes-per-second table scans, providing faster insight into data
- Application latency under 700 microseconds against 11.2 terabytes of data

VCE also enables financial institutions to utilize the same infrastructure resources to consolidate other enterprise applications in the data center (e.g., internal or back-end marketing applications). The ESG Lab report found that this specialized system is ideal for consolidating non-Oracle applications and multiple Oracle databases, even those with different versions, all of which helps minimize deployment, upgrade, licensing and ongoing operational costs.
I was struggling with the right way to convey my excitement in this blog when I ran across this quote from the late senator and astronaut John Glenn: "To sit back and let fate play its hand out and never influence it is not the way man was meant to operate." That's exactly the challenge, and the thrill, I wanted to describe.

After decades of working with our customers in every industry you can imagine, we still weren't satisfied with how our customers struggled to deploy their PCs, especially in large quantities and for remote and mobile employees. We invested in global research to zero in on the most difficult phases of the process. Sure, the results identified the phases we all know and... well, not love. But the research also helped prioritize the hurdles. Configuration – imaging, BIOS, tagging, etc. – was number one, but scratch the surface just a bit more and the research revealed a hidden desire shared by almost all of the respondents: more direct control of, and visibility into, the whole process.

The bottom line: IT managers need to focus on their critical initiatives instead of deployment tasks, and employees need a fully configured system that works right out of the box. With the launch of the ProDeploy Client Suite, we're delivering that experience!

By alleviating and/or automating time-consuming, labor-intensive deployment tasks and addressing planning, dashboard visibility, configuration, integration and post-project needs, we've enabled our customers' administrators and service partners to deploy PCs with greater speed, less effort and more control – empowering their doers quickly and easily.

Plus, depending on the customer's needs, the suite has three offers that align to common deployment scenarios:

Basic Deployment: We prepare the systems for deployment in the factory.
They arrive with the image loaded, BIOS configured, and asset tag applied.

ProDeploy: Includes all the features of Basic Deployment, plus coordination of all aspects of hardware and system software installation and configuration, including up-front planning, 24×7 onsite installation, and post-deployment knowledge transfer. You can also use our ImageAssist tool to quickly create, deploy and maintain a single cross-platform dynamic image.

ProDeploy Plus: Has all the capabilities of ProDeploy, plus a distribution point for Microsoft System Center Configuration Manager in the factory, data migration with secure data wipe of legacy systems, training credits, 30-day post-deployment support, and a dedicated ProSupport Technical Account Manager.1

For customers who want control and visibility of the project, there's TechDirect, a self-service portal that allows customers and partners to easily direct and control their projects with less risk of mistakes, making the entire engagement more efficient and effective.

The results? A third-party lab tested ProDeploy Plus and was able to deploy PCs up to 35 percent faster.2 Couple that with IDC's finding that ProDeploy Plus provides cost savings of up to $620 per PC,3 and you have an absolute winner. How's that for "influencing" your fate?

This is also good news for our channel partners and the customers they serve. Our channel partners have the flexibility to resell or co-deliver these services and are now fully enabled with a framework designed to supplement their capabilities, grow their services revenue and deliver their customers the best possible deployment experience.

We are not "sitting back and letting fate play its hand out." We identified the challenge and went to work. With the ProDeploy Client Suite, we are providing the right tools, processes and experts for a powerful deployment "one-two" punch, reducing risk and saving significant time and cost. Want to learn more?
There are links below, or send me an email.

The ProDeploy Client Suite is available for Dell Latitude, OptiPlex and Precision systems in 70 countries across the globe. Check out more details in today's press release, and for the latest news from Dell EMC Services, follow @DellEMCServices or @DellProSupport on Twitter.

1 Available for ProSupport Plus customers who qualify for a Technical Account Manager.
2 Based on a May 2016 Principled Technologies report commissioned by Dell. Testing results extrapolated from a 10-system deployment to project time savings for larger deployments compared to in-house manual deployment. Actual results will vary. Full report: http://bit.ly/2hgCBrO
3 Based on an IDC white paper commissioned by Dell, "The Business Value of Utilizing Deployment Services," July 2016. Results derived from a survey of 550 organizations. Savings calculated on a Tier 3 deployment. Cost savings in U.S. dollars. Actual results will vary. Full report: http://dell.to/2gnfFFX
More and more, as the world's digital transformation accelerates, technology isn't just a business enabler – it is the business. Take FedEx, for example. Pioneering founder and CEO Frederick W. Smith was among the first to recognize that technology defined his business model when he described FedEx as "an IT company that ships boxes to pay for all its infrastructure."

To be sure, FedEx didn't sit still – nor did competitors like UPS, DHL and even the U.S. Postal Service – but kept investing in its infrastructure for speed and agility. Today all of these companies continue to invest in keeping their server infrastructures fresh because they know it pays dividends to do so. They do it not only to boost speed and lower costs, but also to enable new applications and services that might not otherwise be possible, and to gain the agility needed to respond to new opportunities.

IDC Makes the Business Case for Refreshing Servers Every Three Years

According to market research firm IDC, the operating costs of a data center's servers rise dramatically after their third birthday. In fact, in years 4-6, the cumulative operating costs of an older server can reach 10x its initial purchase price. Not only is that more than enough to pay for new servers, but the latest server technology can also provide far more performance, scalability and agility to manage workloads.

In a study based on interviews with 14 companies averaging $8 billion in annual revenue, IDC made several findings that support a compelling business case for refreshing servers more frequently. One was that companies could save $14.6 million by running two three-year refresh cycles over a six-year span – six years being the traditional refresh cycle. Another was that the added speed, scalability and agility could help these firms generate an average of $4.7 million in extra revenue.
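The shape of this argument is easy to sanity-check with back-of-the-envelope arithmetic. The short Python sketch below models it with hypothetical per-server numbers: the capex, opex and aging-related opex growth rate are illustrative assumptions, not IDC's figures or methodology.

```python
# Illustrative sketch of server-refresh economics.
# All figures are hypothetical placeholders, not IDC's data.

def total_cost(refresh_every, horizon_years, server_capex, base_opex, opex_growth):
    """Total cost of ownership over `horizon_years`, buying new hardware
    every `refresh_every` years. Opex compounds as a server ages."""
    cost = 0.0
    age = 0
    for year in range(horizon_years):
        if year % refresh_every == 0:                 # refresh year: buy new hardware
            cost += server_capex
            age = 0
        cost += base_opex * (1 + opex_growth) ** age  # aging raises operating cost
        age += 1
    return cost

# Hypothetical per-server inputs: $7k capex, $2k/yr base opex, +45%/yr with age.
keep_six_years = total_cost(6, 6, 7_000, 2_000, 0.45)
refresh_at_three = total_cost(3, 6, 7_000, 2_000, 0.45)
print(f"one 6-year cycle:     ${keep_six_years:,.0f}")
print(f"two 3-year cycles:    ${refresh_at_three:,.0f}")
```

Under these made-up inputs, two three-year cycles come out cheaper than one six-year cycle because the aging opex compounds faster than the extra capex accrues – the same directional conclusion IDC reached from its interview data.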
Finally, IDC found that with those savings and added revenues, the payback period on the cost of the new servers was less than a year.

Dell EMC Responds to Customers' Needs With 14th-Generation Servers

In July, Dell EMC announced the availability of its latest PowerEdge servers, the 14th generation. Our newest lineup of five new PowerEdge server models incorporates several years and hundreds of millions of dollars of R&D investment to meet the kinds of IT transformation needs that IDC noted in the study just mentioned.

So what's new? While the innovations are many, they fall into three key categories:

Optimized, scalable business architecture. In the past, servers were designed to handle multiple workloads and treated them all equally, even when each was vastly different; server model choices came down to processor speed, memory and density. Today's 14th-generation PowerEdge servers provide all that using the most advanced technologies available, but each model's architecture and engineering are optimized for specific categories of workloads, especially in terms of workload acceleration, storage and tuning. Examples include mission-critical databases; virtual desktop infrastructure (VDI) and machine learning; big data and software-defined storage; and high-performance computing and high-frequency trading.

For example, compared to PowerEdge R720 servers that used traditional, shared HDD storage, the new PowerEdge R740xd servers sport low-latency Non-Volatile Memory Express (NVMe) technology in a VMware vSAN environment. This optimization can enable them to process up to eight times more database operations per minute.

Intelligent Automation. The new PowerEdge servers can help companies further transform their IT efficiency by automating the entire server lifecycle, from deployment to retirement, with embedded intelligence that dramatically increases IT staff productivity.
With expanded APIs and an all-new OpenManage Enterprise console, IT team members can spend more time on higher-priority work. What's more, the enhanced iDRAC 9 delivers up to four times better systems-management performance than the prior generation of PowerEdge servers. Joe Wiese, a systems design engineer at Rackspace Hosting, says:

"I totally love the new iDRAC 9. The new version is quite comprehensive and can be used to configure most everything for managing and maintaining the servers... the iDRAC, the PERC, the NICs and the BIOS, all through a single web interface. It is hands-down the single greatest addition to PowerEdge that I have encountered."

Integrated Security. With cyber threats growing in frequency and sophistication, the newest PowerEdge servers offer a deep layer of defense built into their hardware and firmware. They also feature quick detection and, if breached, fast recovery to a trusted base.

Among other security features is System Lockdown, an industry first that prevents configuration changes that could create security vulnerabilities and expose sensitive data. In addition, features such as Secure Boot, BIOS recovery capabilities, signed firmware and the iDRAC RESTful API (compliant with Redfish standards) provide enhanced protection against attacks.

Where Do You Stand? Assess Your Infrastructure and Learn More

For those of you interested in assessing the performance of your servers and workloads, Dell EMC offers its Performance Analysis Collection Kit, also known as DPACK. Based on an industry-standard method of impartially documenting the performance of one server or a group of them, DPACK is a cloud-based IT infrastructure planning software tool that you can download from Dell for free. The analysis service is hardware- and platform-agnostic.
It records workload characteristics, measures performance and creates simulations across various industry-leading server platforms.

And if you haven't had the chance to read it yet, we invite you to download the free, 20-page IDC white paper, "Accelerate Business Agility with Faster Server Refresh Cycles." Finally, for more insights into the new Dell EMC 14th-generation, workload-optimized PowerEdge servers, check out this recent blog post from Brian Payne, Dell EMC Executive Director of Product Management.
Today, Dell EMC is expanding its modern protection storage portfolio with the Dell EMC Data Domain DD3300, a new platform specifically designed to deliver enterprise-level data protection to small and mid-size organizations and to the remote/branch office environments of larger enterprises.

Why Data Domain DD3300, and Why Now?

Organizations today face many challenges and conflicting priorities, including data growth, an ever-growing number of applications to support, an increasingly stringent regulatory and compliance environment, and continuously shrinking budgets. For organizations of all sizes, it is more critical to business success than ever before that their data is protected and that they can easily leverage the cloud for flexibility, agility and economics.

Achieve Enterprise-Level Data Protection Without an Enterprise-Size Data Center

Data Domain DD3300 is purpose-built to modernize data protection for small to mid-size organizations and enterprise remote/branch office environments. DD3300 is simple and comprehensive, cloud-ready, and offers multi-site scalability.

DD3300 offers Dell EMC's comprehensive data protection capabilities, including inline encryption and DD Boost for faster backups and lower network usage. It also provides coverage for a wide application ecosystem, from enterprise to homegrown applications. A 2U appliance that lets you start small and expand capacity as your needs grow, and with an average data reduction rate in the range of 10-55x, the DD3300 provides an impressive ROI along with dramatic cost savings, greater scalability and a significant reduction in the WAN bandwidth used for backups and recovery.1

Modernize Your Data Protection With a Simple Extension to the Cloud

To enable smaller IT environments to extend simply to the cloud for long-term retention, DD3300 supports Data Domain Cloud Tier.
With DD3300, organizations can natively tier deduplicated data to the cloud for long-term retention without the need for a separate cloud gateway or virtual appliance. This new compact Data Domain delivers cost-effective long-term retention across a wide cloud ecosystem, including Dell EMC Elastic Cloud Storage (ECS) and the major public clouds.

DD3300 also supports the Data Domain Cloud Disaster Recovery service. In conjunction with Dell EMC Data Protection Software, Cloud Disaster Recovery enables virtual machine images protected on a Data Domain DD3300 to be copied to object storage in the public cloud for a modern, cost-efficient disaster recovery solution that takes advantage of the cloud.

Protect Data Wherever It Resides With Multi-Site Scalability

With the included capabilities of Data Domain Replicator, DD3300 provides fast, network-efficient and encrypted replication from remote offices to the central data center. It transfers only deduplicated data across the network, eliminating up to 98% of the bandwidth required.2 With its compact design, DD3300 is ideal for multi-site scenarios with remote and/or branch IT environments that house data separately from the central IT environment.

Get Even More With Dell EMC Data Protection Software

To maximize return on investment and get the most out of Dell EMC's advanced deduplication, Data Domain DD3300 can be paired with our modern Data Protection software. With DD3300 and Dell EMC Data Protection software, customers can amplify logical capacity and cloud capabilities, benefit from an intuitive, user-friendly interface for simpler management, and take advantage of advanced VMware automation and integration capabilities.

Whether securing the data of a small to mid-sized business or a department of a Fortune 100 company, Dell EMC Data Domain DD3300 provides dependable protection.

Dell EMC Data Domain DD3300 is generally available through Dell EMC and channel partners.
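The bandwidth savings from deduplicated replication come from sending only chunks the target has not already seen. The toy Python sketch below illustrates the principle with fixed-size chunking and content hashes; it is a simplified illustration of the general technique, not Data Domain's actual variable-length segmenting algorithm.

```python
import hashlib

CHUNK = 4096  # fixed-size chunks for illustration; real systems use variable-length segments

def replicate(data: bytes, target_index: set) -> tuple:
    """Send `data` to a replica that already holds the chunks in `target_index`.
    Returns (bytes_sent, bytes_total): only previously unseen chunks cross the wire."""
    sent = 0
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in target_index:   # unseen chunk: transfer it
            target_index.add(digest)
            sent += len(chunk)
        # seen chunk: only its fingerprint is sent (negligible vs. the chunk itself)
    return sent, len(data)

index = set()
day1 = b"".join(bytes([i]) * CHUNK for i in range(100))   # initial backup: 100 new chunks
sent1, total1 = replicate(day1, index)
day2 = day1 + bytes([254]) * CHUNK + bytes([255]) * CHUNK # next backup: 2 new chunks
sent2, total2 = replicate(day2, index)
print(f"day 1: sent {sent1}/{total1} bytes")
print(f"day 2: sent {sent2}/{total2} bytes ({100 * (1 - sent2 / total2):.0f}% saved)")
```

On the second backup only the two new chunks are transferred, so nearly all of the bytes stay off the wire; real-world savings depend on the change rate and chunking granularity.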
To learn more about Dell EMC Data Domain DD3300, view the product details on the Dell EMC Store and follow @DellEMCProtect on Twitter for the latest announcements, customer case studies and topical content.

1 Based on Dell EMC internal analysis of >15,000 Data Domain systems deployed worldwide, November 2017
2 Based on Dell EMC internal analysis, July 2017
Digital transformation is every leader's concern. However, when I meet with a customer's management team, each member looks at this transformation from a different angle. There seems to be a recurring sentiment best summarized as: the CEO is the CEO and the CIO is the CIO, and never the twain shall meet. This stems from the reality that for a long time there has been a huge gap between the office of the CEO and the IT department. In fact, this gap is not only metaphorical but often literal, with many still two organizational levels apart and a CFO or COO standing between them. Fortunately, digital transformation is now driving the CEO and CIO closer together – to the benefit of the company.

So are the CEO and the CIO completely different species? In short, yes! But perhaps we should see the differences not as troublesome but as complementary, much as baby boomers differ radically from Generation X, yet the two generations work side by side, complementing each other's strengths and compensating for each other's weak points. The CEO and the CIO stand to benefit from working closely together in the same way.

Going Digital

As we know from Gartner's 2017 CEO Survey, profit and revenue growth top the CEO's list of priorities, but they are closely followed by IT-related focus areas, with 47% of CEOs experiencing pressure from the board to make progress in digital. And this is exactly where a CIO can help. Digital projects are the most likely area to support product renewal and innovation. As we heard from Fresenius CEO Pascale Wirtz recently, in a series of CEO interviews we conducted: "What the CEO needs to do is to find a way to have digital completely embedded in one, two or three business initiatives." What better person to turn to than the CIO to ensure that new digital initiatives see the light of day, since the CIO knows best what benefits new technologies can bring to the business.
There’s a huge potential out there for the CIO to guide the CEO, or as Gartner’s CEO Survey puts it: “CIOs must see it as their job to gently educate their CEOs and expand their horizons.” Digital is far more than automating paper-based, back-office processes or setting up an e-business division, it’s transforming business models, increasing productivity, delivering better customer service, and ultimately takes you closer to the number one priority – increased profit and revenue.Zoning to WinIn IT departments it has become common practice to work bi-modal or dual-speed: while back-office IT should be as stable as possible, areas of innovation require more risk-taking and experimentation. The CIO can definitely help the CEO in setting priorities and deciding where to build a fence between traditional business-as-usual and innovation. This bi-modal IT theory is reflected in business guru Geoffrey Moore’s the ‘Zone to Win’ strategy, which is getting lots of attention right now. In this strategy, a company divides itself into different zones that work towards different results and metrics. After all, digital transformation is very much like rebuilding a plane while it is in the air – you need to remember to keep the plane flying!Getting the Numbers RightAt the end of the day, the CEO is focused on one thing – the bottom line. So making the case for an innovation project has to ultimately be a means to that (future) end, and tracking the progress of that project is another area where the CIO can prove their worth to the CEO. As seasoned business leaders, we know how difficult it sometimes is to get the right numbers, keeping in mind the relative performance of different departments. And that’s for business-as-usual metrics! When it comes to digital initiatives with new goalposts, it gets even more challenging to measure success. 
Together with the CFO, the CIO is best placed first to define the right metrics and then to provide the numbers at the end of every month, demonstrating the progress being made and, ultimately, how the initiative is driving business transformation and growth.

Changing the Skill Set

Over the years, the role of the CIO and the composition of the IT department have changed enormously. As IT has become more strategic, the CIO has morphed from 'the IT person' into a key player in the C-suite, and the rest of the IT staff have followed the same path, becoming business relationship managers rather than 'just' tech gurus. The rest of the organization will need to undergo a similar change: acquiring new skill sets, creating diverse teams and encouraging openness toward learning more about other parts of the company. Both the CEO and the CIO will need to embrace new leadership methods and strengthen their interpersonal skills. This is an area where the two can learn from each other's experiences to ensure that talent is developed within the organization. After all, with great talent comes great performance.

Being a Trusted Advisor

In the fourth industrial revolution we are living through now, the will to win is stronger than ever, especially for the CEO. In the digital era, the CEO will be the explorer, charting the strategy to discover new lands. The CIO will be his or her most likely choice for navigator, best placed to steer toward the game-changing tactics that turn the strategy into a winning one. After all, the CIO's role is not simply to follow orders but to advise the CEO, helping him or her see not only the waters they are in, but also the massive opportunities on the horizon.
LOS ANGELES (AP) — With his final two performances, the late Chadwick Boseman has earned two NAACP Image Awards nominations. Boseman scored nods Tuesday for his work in the Netflix films “Da 5 Bloods” and “Ma Rainey’s Black Bottom.” The actor died last year after privately battling colon cancer. “Ma Rainey’s Black Bottom” came away with nine nominations. It delves into the story of blues singer Ma Rainey during a turbulent recording session at a Chicago music studio in 1927. Netflix emerged with a leading 48 nominations. The awards honoring entertainers and writers of color will air on CBS on March 27.