
A Quick Tour of CDH 5.13

程序员文章站 2022-07-04 17:47:09

Compared with plain Apache Hadoop, which is quite hard to use, the commercial Hadoop distributions, such as Cloudera, Hortonworks, MapR, and China's Transwarp, offer better performance and usability. Below is a quick tour using CDH (Cloudera Distribution including Apache Hadoop).

First, download the pre-built QuickStart VM image from the Cloudera website, unpack it, and open it in your virtualization software. The official recommendation is at least 8 GB of RAM and 2 CPUs; since my laptop has resources to spare, I started it with 8 GB of RAM and 8 CPUs instead. Every account and password in the VM is cloudera.

Open the browser inside the VM, visit the welcome page, and click Get Started to begin the tutorial.


Tutorial Exercise 1: Import and query relational data

Use the Sqoop tool to import the MySQL data into HDFS:


[cloudera@quickstart ~]$ sqoop import-all-tables \
>     -m 1 \
>     --connect jdbc:mysql://quickstart:3306/retail_db \
>     --username=retail_dba \
>     --password=cloudera \
>     --compression-codec=snappy \
>     --as-parquetfile \
>     --warehouse-dir=/user/hive/warehouse \
>     --hive-import
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
19/04/29 18:31:46 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.13.0
19/04/29 18:31:46 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
19/04/29 18:31:46 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
19/04/29 18:31:46 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
19/04/29 18:31:46 WARN tool.BaseSqoopTool: It seems that you're doing hive import directly into default

(many more lines suppressed)

                Failed Shuffles=0
                Merged Map outputs=0
                GC time elapsed (ms)=87
                CPU time spent (ms)=3690
                Physical memory (bytes) snapshot=443174912
                Virtual memory (bytes) snapshot=1616969728
                Total committed heap usage (bytes)=352845824
        File Input Format Counters 
                Bytes Read=0
        File Output Format Counters 
                Bytes Written=0
19/04/29 18:38:27 INFO mapreduce.ImportJobBase: Transferred 46.1328 KB in 85.1717 seconds (554.6442 bytes/sec)
19/04/29 18:38:27 INFO mapreduce.ImportJobBase: Retrieved 1345 records.
[cloudera@quickstart ~]$ hadoop fs -ls /user/hive/warehouse/
Found 6 items
drwxrwxrwx   - cloudera supergroup          0 2019-04-29 18:32 /user/hive/warehouse/categories
drwxrwxrwx   - cloudera supergroup          0 2019-04-29 18:33 /user/hive/warehouse/customers
drwxrwxrwx   - cloudera supergroup          0 2019-04-29 18:34 /user/hive/warehouse/departments
drwxrwxrwx   - cloudera supergroup          0 2019-04-29 18:35 /user/hive/warehouse/order_items
drwxrwxrwx   - cloudera supergroup          0 2019-04-29 18:36 /user/hive/warehouse/orders
drwxrwxrwx   - cloudera supergroup          0 2019-04-29 18:38 /user/hive/warehouse/products
[cloudera@quickstart ~]$ hadoop fs -ls /user/hive/warehouse/categories/
Found 3 items
drwxr-xr-x   - cloudera supergroup          0 2019-04-29 18:31 /user/hive/warehouse/categories/.metadata
drwxr-xr-x   - cloudera supergroup          0 2019-04-29 18:32 /user/hive/warehouse/categories/.signals
-rw-r--r--   1 cloudera supergroup       1957 2019-04-29 18:32 /user/hive/warehouse/categories/6e701a22-4f74-4623-abd1-965077105fd3.parquet
[cloudera@quickstart ~]$ 

Then visit http://quickstart.cloudera:8888/ to open Hue and query the imported tables (running "invalidate metadata;" first refreshes Impala's metadata cache so the newly imported tables show up).
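As a sketch, a minimal session in the Hue Impala query editor might look like the following (the table names come from the retail_db import above; the exact queries are illustrative, not taken from the tutorial):

```sql
-- Refresh Impala's view of the Hive metastore so the Sqoop-imported tables appear
invalidate metadata;
show tables;
-- Sanity-check one of the imported tables
select count(*) from categories;
```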


Tutorial Exercise 2: Import access-log data into HDFS via an external table and query it

Create the tables in Hive:

create external table intermediate_access_logs (
    ip string,
    date string,
    method string,
    url string,
    http_version string,
    code1 string,
    code2 string,
    dash string,
    user_agent string)
row format serde 'org.apache.hadoop.hive.contrib.serde2.RegexSerDe'
with serdeproperties (
    'input.regex' = '([^ ]*) - - \\[([^\\]]*)\\] "([^\ ]*) ([^\ ]*) ([^\ ]*)" (\\d*) (\\d*) "([^"]*)" "([^"]*)"',
    'output.format.string' = "%1$$s %2$$s %3$$s %4$$s %5$$s %6$$s %7$$s %8$$s %9$$s")
location '/user/hive/warehouse/original_access_logs';
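The input.regex above carves each access-log line into nine capture groups (IP, timestamp, method, URL, protocol, two status/size codes, referrer, user agent). As a quick local sanity check, an equivalent pattern can be exercised with sed on a made-up log line (the sample line below is an assumption for illustration, not data from the VM):

```shell
# Hypothetical log line in the combined format the RegexSerDe expects
line='127.0.0.1 - - [29/Apr/2019:18:31:46 -0400] "GET /product/123 HTTP/1.1" 200 1024 "-" "Mozilla/5.0"'
# Mirror the SerDe regex and extract the URL (capture group 4)
url=$(echo "$line" | sed -E 's/^([^ ]*) - - \[([^]]*)\] "([^ ]*) ([^ ]*) ([^ ]*)" ([0-9]*) ([0-9]*) "([^"]*)" "([^"]*)".*$/\4/')
echo "$url"
```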

create external table tokenized_access_logs (
    ip string,
    date string,
    method string,
    url string,
    http_version string,
    code1 string,
    code2 string,
    dash string,
    user_agent string)
row format delimited fields terminated by ','
location '/user/hive/warehouse/tokenized_access_logs';

add jar /usr/lib/hive/lib/hive-contrib.jar;

insert overwrite table tokenized_access_logs select * from intermediate_access_logs;
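Once the logs have been rewritten into the delimited table, they can be analyzed with ordinary SQL. As an illustrative sketch (column names come from the DDL above; the tutorial's own query may differ in detail), counting hits per product page looks like:

```sql
-- Rank product pages by number of accesses in the tokenized logs
select url, count(*) as hits
from tokenized_access_logs
where url like '%/product/%'
group by url
order by hits desc
limit 10;
```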

After refreshing the metadata in Impala, the table can be queried in Hue.


Tutorial Exercise 3: Correlation analysis with Spark


Tutorial Exercise 4: Collect logs with Flume and build a full-text index with Solr


Tutorial Exercise 5: Visualization

The tutorial is over!