Server mode: the job is submitted on the server
First, place the Hadoop configuration files on the server under src.
* Server mode: submit
  * a. Package the MR program as a JAR and upload it to the server
  * b. Run it with: `hadoop jar <jar path> <fully qualified class name>`
The Mapper class:
```java
package com.chb;

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.util.StringUtils;

/**
 * map() is called once for each line read from the file split: the key is the
 * line's byte offset, the value is the line's content. For each word the mapper
 * emits (word, 1); the framework then shuffles and groups this output by key
 * before passing it to the reducer.
 */
public class WCMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String[] words = StringUtils.split(value.toString(), ' ');
        for (String w : words) {
            context.write(new Text(w), new IntWritable(1));
        }
    }
}
```
The Reducer class:
```java
package com.chb;

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

/**
 * reduce() is called once per group of mapper outputs that share the same key.
 * It sums the counts for each word and writes (word, total).
 */
public class WCReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    @Override
    protected void reduce(Text key, Iterable<IntWritable> vals, Context context)
            throws IOException, InterruptedException {
        int sum = 0;
        for (IntWritable iw : vals) {
            sum += iw.get();
        }
        context.write(key, new IntWritable(sum));
    }
}
```
The driver class:
```java
package com.chb;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WC {
    public static void main(String[] args) {
        // Load the configuration (reads the Hadoop config files from the classpath)
        Configuration conf = new Configuration();
        try {
            FileSystem fs = FileSystem.get(conf);

            Job job = Job.getInstance(conf);
            job.setJarByClass(WC.class);
            job.setJobName("WC");

            // Set the Mapper and Reducer
            job.setMapperClass(WCMapper.class);
            job.setReducerClass(WCReducer.class);

            // Output types of the Mapper
            job.setMapOutputKeyClass(Text.class);
            job.setMapOutputValueClass(IntWritable.class);

            // Input and output paths; delete the output path if it already exists
            FileInputFormat.addInputPath(job, new Path("/user/chb/input/"));
            Path out = new Path("/user/chb/output/wc");
            if (fs.exists(out)) {
                fs.delete(out, true);
            }
            FileOutputFormat.setOutputPath(job, out);

            boolean f = job.waitForCompletion(true);
            if (f) {
                System.out.println("Job completed");
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
```
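The map/reduce logic above can be sanity-checked locally before packaging the JAR. The sketch below mimics the two steps in plain Java with no Hadoop dependency; the class and method names here are hypothetical and not part of the job code:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Local sanity check of the WordCount logic (no Hadoop dependency).
// The "map" step splits each line on spaces and emits (word, 1);
// the "reduce" step sums the counts per word.
public class LocalWC {
    public static Map<String, Integer> count(String[] lines) {
        Map<String, Integer> counts = new LinkedHashMap<>();
        for (String line : lines) {
            for (String w : line.split(" ")) {
                if (!w.isEmpty()) {
                    counts.merge(w, 1, Integer::sum); // reduce: sum per key
                }
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        System.out.println(count(new String[] {"a b a", "b c"})); // {a=2, b=2, c=1}
    }
}
```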
Configuring the connection
Uploading files to HDFS through Eclipse fails with a permission error.
Solution: unresolved; upload the file from the command line instead.
---
Run the following command to upload a local file to HDFS:

```shell
hadoop fs -put wc.txt /user/chb/input/
```
Export the project as a JAR package:
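The export can also be done from the command line instead of the IDE. This is only a sketch: the `bin/` directory and the `wc.jar` name are assumptions about the project layout, not something fixed by the steps above.

```shell
# Package the compiled classes (assumed to be under bin/) into wc.jar
jar -cvf wc.jar -C bin/ .
```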
Run the program. Be sure to use the fully qualified class name, including the package name.
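Assuming the exported JAR is named `wc.jar` (the name is an assumption; use whatever name you exported), the submission command from the list at the top looks like:

```shell
# hadoop jar <jar path> <fully qualified class name>
hadoop jar wc.jar com.chb.WC
```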