
2019-02-27
Spark Scala: getting the count of non-zero columns in a DataFrame row

I have a scenario where, for each row, I need to count the columns that hold a non-zero value.

DataFrame:

+--------+----+----+----+----+----+----+------+----+----+----+-----+-----+-----+
|subaccid|srp0|srp1|srp2|srp3|srp4|srp5|srp6  |srp7|srp8|srp9|srp10|srp11|srp12|
+--------+----+----+----+----+----+----+------+----+----+----+-----+-----+-----+
|AAA     |0.0 |12.0|12.0|0.0 |0.0 |0.0 |10.0  |0.0 |0.0 |0.0 |0.0  |0.0  |0.0  |
|AAB     |12.0|12.0|12.0|10.0|12.0|12.0|12.0  |0.0 |0.0 |0.0 |0.0  |0.0  |0.0  |
|AAC     |10.0|12.0|0.0 |0.0 |0.0 |10.0|10.0  |0.0 |0.0 |0.0 |0.0  |0.0  |0.0  |
|ZZZ     |0.0 |0.0 |0.0 |0.0 |0.0 |0.0 |-110.0|0.0 |0.0 |0.0 |0.0  |0.0  |0.0  |
+--------+----+----+----+----+----+----+------+----+----+----+-----+-----+-----+

Expected output:

subaccid,count of nonzeros
AAA,3
AAB,7
AAC,4
ZZZ,1
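
For anyone who wants to reproduce the question's data before trying the solution, here is a minimal sketch; the local SparkSession setup and the name dfQ are my assumptions, not part of the original post.

import org.apache.spark.sql.SparkSession

// Assumed local setup; in spark-shell, `spark` already exists.
val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

// dfQ is a hypothetical name for the question's DataFrame.
val dfQ = Seq(
  ("AAA",  0.0, 12.0, 12.0,  0.0,  0.0,  0.0,   10.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0),
  ("AAB", 12.0, 12.0, 12.0, 10.0, 12.0, 12.0,   12.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0),
  ("AAC", 10.0, 12.0,  0.0,  0.0,  0.0, 10.0,   10.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0),
  ("ZZZ",  0.0,  0.0,  0.0,  0.0,  0.0,  0.0, -110.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0)
).toDF("subaccid", "srp0", "srp1", "srp2", "srp3", "srp4", "srp5", "srp6",
  "srp7", "srp8", "srp9", "srp10", "srp11", "srp12")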

Solution: the following also works, without any RDD manipulation, using my own sample data:

import org.apache.spark.sql.functions._
import spark.implicits._

val df = sc.parallelize(Seq(
  ("r1", 0.0, 0.0, 0.0, 0.0),
  ("r2", 6.4, 4.9, 6.3, 7.1),
  ("r3", 4.2, 0.0, 7.2, 8.4),
  ("r4", 1.0, 2.0, 0.0, 0.0)
)).toDF("ID", "a", "b", "c", "d")

// For each value column, emit 1 when it is non-zero (note =!=), 0 otherwise,
// then add the per-column indicators into a single Column expression.
val count_non_zero = df.columns.tail.map(x => when(col(x) =!= 0.0, 1).otherwise(0)).reduce(_ + _)

df.withColumn("non_zero_count", count_non_zero).show(false)

Yields:

+---+---+---+---+---+--------------+
|ID |a  |b  |c  |d  |non_zero_count|
+---+---+---+---+---+--------------+
|r1 |0.0|0.0|0.0|0.0|0             |
|r2 |6.4|4.9|6.3|7.1|4             |
|r3 |4.2|0.0|7.2|8.4|3             |
|r4 |1.0|2.0|0.0|0.0|2             |
+---+---+---+---+---+--------------+

This assumes double/real-typed columns, so we do not run into any asInstanceOf issues.
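
As a side note, on Spark 3.x the same per-row count can also be written with higher-order functions instead of a chain of when expressions. A minimal sketch against the dfQ sample rebuilt above (count_of_nonzeros is an assumed output column name):

import org.apache.spark.sql.functions._

// Pack the srp columns into an array, keep the non-zero entries, count them.
val valueCols = dfQ.columns.tail.map(col)
dfQ.select(
  col("subaccid"),
  size(filter(array(valueCols: _*), v => v =!= 0.0)).as("count_of_nonzeros")
).show(false)
// Expected: AAA -> 3, AAB -> 7, AAC -> 4, ZZZ -> 1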
