Amazon Elastic MapReduce
Built on Hadoop, Amazon Elastic MapReduce equips users with potent distributed data-processing tools
- Doesn't take long to get the hang of
- Currently available in the US region only
You'll want to be familiar with the Apache Hadoop framework before you jump into Elastic MapReduce. It doesn't take long to get the hang of it, though. Most developers can have a MapReduce application running within a few hours.
These two steps, the map function and the reduce function, make up what Amazon Elastic MapReduce refers to as a "job flow." Admittedly, this is an oversimplification, because job flows involve other configuration parameters (such as where you get the input data and where you put the output), and you can define additional steps in the process, but that's the basic idea.
As a result, a programmer building a Hadoop-powered MapReduce system can focus on the comparatively simple job of crafting the individual functions that process single key/value pairs at a time. Hadoop does the legwork of carving the input data into initial key/value pairs; starting multiple map function instances; feeding them input data; gathering, sorting, and ordering the intermediate key/value pairs; launching reduce instances; feeding them the properly arranged intermediate data; and -- finally -- delivering the output. And all the while, Hadoop monitors the progress of map and reduce tasks, and restarts "dead" ones automatically. Whew.
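The division of labor described above can be sketched without any Hadoop at all. The following illustrative Python snippet (not Hadoop code; the function and variable names are my own) uses word counting to show the two pieces a programmer actually writes -- a map function and a reduce function -- while a few lines of driver code stand in for the splitting, shuffling, and sorting that Hadoop performs behind the scenes:

```python
# A framework-free sketch of the map/reduce idea, using word count as
# the example. In a real job flow, Hadoop performs everything run_job
# does here -- at scale, across many machines.
from itertools import groupby
from operator import itemgetter

def map_fn(line):
    """Map step: emit an intermediate (word, 1) pair per word."""
    for word in line.split():
        yield (word.lower(), 1)

def reduce_fn(word, counts):
    """Reduce step: collapse all counts gathered for one word."""
    return (word, sum(counts))

def run_job(lines):
    # "Map" phase: apply map_fn to every input record.
    intermediate = [pair for line in lines for pair in map_fn(line)]
    # "Shuffle/sort" phase: order intermediate pairs by key so that
    # each distinct key's values arrive at reduce_fn together.
    intermediate.sort(key=itemgetter(0))
    # "Reduce" phase: one reduce_fn call per distinct key.
    return [reduce_fn(word, (count for _, count in group))
            for word, group in groupby(intermediate, key=itemgetter(0))]

print(run_job(["the quick brown fox", "the lazy dog"]))
# → [('brown', 1), ('dog', 1), ('fox', 1), ('lazy', 1), ('quick', 1), ('the', 2)]
```

Note that neither `map_fn` nor `reduce_fn` knows anything about files, networks, or other machines -- which is precisely why the model parallelizes so well.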
Hadoop in the cloud
To access Amazon's Elastic MapReduce, your first stop is your Amazon Web Services account page (assuming you have an account with AWS), where you must sign up for the Elastic MapReduce service. Then, head on over to the AWS Management Console and log in. You'll find that the AWS Console -- which had been a control panel for Amazon's EC2 only -- displays a new Amazon Elastic MapReduce tab. Click the tab, and you are transferred to the Job Flows page, from which you can monitor the status of current job flows, as well as examine details of previous (terminated) job flows.
To define a new job flow, click the Create New Job Flow button. This sends you through a series of windows in step-by-step fashion. You fill in textboxes to define the location of your input data, where you want your output data, and the paths to your map and reduce functions. All of these locations must be Amazon S3 buckets. The output location need not exist beforehand; it is created when the job flow concludes. Consequently, it's a good idea to have a utility for transferring data to and from S3 on hand. I recommend the excellent S3Fox Organizer.
Amazon Elastic MapReduce allows for two kinds of job flows: custom JAR and streaming. A custom JAR-style job flow expects your map and reduce functions to be compiled Java classes stored in Java JAR files. Because the Hadoop framework is itself Java-based, a custom JAR job flow provides the better performance. A streaming-type job flow, on the other hand, lets you write your map and reduce functions in non-Java languages such as Python, Ruby, and Perl. The functions of a streaming job flow read their input from stdin and send their output to stdout. Data thus flows in and out of the functions as strings, and -- by convention -- a tab separates the key and value on each line.

Once you've specified the whereabouts of your job flow's components, you identify the quantity and processing power of the EC2 instances on which the job will execute. You can select up to 20 EC2 instances; for any more than that, you have to fill out a special request form. Your choice of compute instances ranges from Small to High-CPU Extra Large. Check the Amazon documentation for a complete description of each instance type.