Import the data into Hive, then write SQL with SparkSQL to implement the computations for the different business metrics. This article focuses on the vehicle checkpoint conversion rate: it measures the flow of vehicles between checkpoints and computes the conversion rate from one checkpoint to the next.
1. Query the vehicle flow for each road segment in each region; the query numbers each vehicle's passage records in time order.
SELECT car, monitor_id, action_time,
       ROW_NUMBER() OVER (PARTITION BY car ORDER BY action_time) AS n1
FROM traffic.hive_flow_action
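The ROW_NUMBER() windowing above can be mimicked in plain Python, which may help if the window semantics are unfamiliar. A minimal sketch; the sample records and the helper name are made up for illustration:

```python
from itertools import groupby
from operator import itemgetter

# Hypothetical sample records: (car, monitor_id, action_time)
records = [
    ("A1", "M2", "2024-01-01 08:05"),
    ("A1", "M1", "2024-01-01 08:00"),
    ("B2", "M1", "2024-01-01 08:01"),
    ("B2", "M3", "2024-01-01 08:10"),
]

def row_number_per_car(rows):
    """Number each car's records by action_time, like
    ROW_NUMBER() OVER (PARTITION BY car ORDER BY action_time)."""
    out = []
    rows = sorted(rows, key=itemgetter(0, 2))          # sort by car, then time
    for car, grp in groupby(rows, key=itemgetter(0)):  # partition by car
        for n1, (c, monitor, t) in enumerate(grp, start=1):
            out.append((c, monitor, t, n1))
    return out

numbered = row_number_per_car(records)
# each row now carries its per-car sequence number n1
```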
This result serves as table 1 and will be reused below for the offset self-join.
2. Obtain each vehicle's travel record via an offset self-join
Join the result of table 1 against itself with the row numbers offset by one, partitioned by license plate, to concatenate the sequence of checkpoints each vehicle passed through.
SELECT t1.car, t1.monitor_id,
       CONCAT(t1.monitor_id, "->", t2.monitor_id) AS way
FROM (SELECT car, monitor_id, action_time,
             ROW_NUMBER() OVER (PARTITION BY car ORDER BY action_time) AS n1
      FROM traffic.hive_flow_action) t1
LEFT JOIN (SELECT car, monitor_id, action_time,
                  ROW_NUMBER() OVER (PARTITION BY car ORDER BY action_time) AS n1
           FROM traffic.hive_flow_action) t2
  ON t1.car = t2.car AND t1.n1 = t2.n1 - 1
WHERE t2.action_time IS NOT NULL
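The offset self-join (matching row n1 with row n1+1 of the same car) is equivalent to zipping each car's time-ordered checkpoint sequence against itself shifted by one. A sketch under that reading, with hypothetical sample data:

```python
from collections import defaultdict

# Hypothetical sample records: (car, monitor_id, action_time)
records = [
    ("A1", "M1", "08:00"), ("A1", "M2", "08:05"), ("A1", "M3", "08:09"),
    ("B2", "M1", "08:01"), ("B2", "M3", "08:10"),
]

def travel_segments(rows):
    """Build "M1->M2"-style segments per car, like the offset self-join on
    t1.n1 = t2.n1 - 1; the WHERE ... IS NOT NULL filter drops each car's
    last record, which has no successor checkpoint."""
    by_car = defaultdict(list)
    for car, monitor, t in sorted(rows, key=lambda r: (r[0], r[2])):
        by_car[car].append(monitor)
    segments = []
    for car, mons in by_car.items():
        # pair each checkpoint with the next one the car passed
        for cur, nxt in zip(mons, mons[1:]):
            segments.append((car, f"{cur}->{nxt}"))
    return segments
```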
This yields each vehicle's travel record, i.e. the checkpoints it passed through, in order.
3. Count the vehicles for each travel segment
Count how many vehicles traveled checkpoint 1 -> checkpoint 2, and so on: group by the travel-record field (way) produced above and count within each group.
SELECT s1.way, COUNT(1) AS sumCar
FROM (
  -- travel segments
  SELECT t1.car, t1.monitor_id,
         CONCAT(t1.monitor_id, "->", t2.monitor_id) AS way
  FROM (SELECT car, monitor_id, action_time,
               ROW_NUMBER() OVER (PARTITION BY car ORDER BY action_time) AS n1
        FROM traffic.hive_flow_action) t1
  LEFT JOIN (SELECT car, monitor_id, action_time,
                    ROW_NUMBER() OVER (PARTITION BY car ORDER BY action_time) AS n1
             FROM traffic.hive_flow_action) t2
    ON t1.car = t2.car AND t1.n1 = t2.n1 - 1
  WHERE t2.action_time IS NOT NULL
) s1
GROUP BY way
4. Get the total vehicle count at each checkpoint
Get each checkpoint's total vehicle count, so that dividing the per-segment vehicle count by this total gives the conversion rate between checkpoints.
SELECT monitor_id, COUNT(1) AS sumall
FROM traffic.hive_flow_action
GROUP BY monitor_id
5. Compute the conversion rate between checkpoints
SELECT s2.way, s2.sumCar / s3.sumall AS zhl
FROM (
  -- vehicle count per travel segment
  SELECT s1.way, COUNT(1) AS sumCar
  FROM (
    SELECT t1.car, t1.monitor_id,
           CONCAT(t1.monitor_id, "->", t2.monitor_id) AS way
    FROM (SELECT car, monitor_id, action_time,
                 ROW_NUMBER() OVER (PARTITION BY car ORDER BY action_time) AS n1
          FROM traffic.hive_flow_action) t1
    LEFT JOIN (SELECT car, monitor_id, action_time,
                      ROW_NUMBER() OVER (PARTITION BY car ORDER BY action_time) AS n1
               FROM traffic.hive_flow_action) t2
      ON t1.car = t2.car AND t1.n1 = t2.n1 - 1
    WHERE t2.action_time IS NOT NULL
  ) s1
  GROUP BY way
) s2
LEFT JOIN (
  -- total vehicle count per checkpoint
  SELECT monitor_id, COUNT(1) AS sumall
  FROM traffic.hive_flow_action
  GROUP BY monitor_id
) s3
  ON split(s2.way, "->")[0] = s3.monitor_id
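End to end, steps 1 through 5 boil down to: build each car's ordered checkpoint sequence, count vehicles per segment, count records per source checkpoint, then divide. A self-contained Python sketch of that pipeline, with hypothetical data:

```python
from collections import Counter, defaultdict

# Hypothetical sample records: (car, monitor_id, action_time)
records = [
    ("A1", "M1", "08:00"), ("A1", "M2", "08:05"),
    ("B2", "M1", "08:01"), ("B2", "M2", "08:06"),
    ("C3", "M1", "08:02"), ("C3", "M3", "08:07"),
]

def conversion_rates(rows):
    """Conversion rate per segment = (# vehicles on "X->Y") / (# records at X)."""
    # step 2: ordered checkpoint sequence per car
    by_car = defaultdict(list)
    for car, monitor, t in sorted(rows, key=lambda r: (r[0], r[2])):
        by_car[car].append(monitor)
    # step 3: vehicle count per travel segment
    seg_count = Counter(
        f"{cur}->{nxt}"
        for mons in by_car.values()
        for cur, nxt in zip(mons, mons[1:])
    )
    # step 4: total record count per checkpoint
    total = Counter(monitor for _, monitor, _ in rows)
    # step 5: divide each segment count by its source checkpoint's total
    return {way: n / total[way.split("->")[0]] for way, n in seg_count.items()}

rates = conversion_rates(records)
# three cars start at M1; two go on to M2 and one to M3
```

Note that, as in the SQL, the denominator counts every record at the source checkpoint, including cars whose journey ends there, so the rates out of a checkpoint need not sum to 1.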