
Apache Spark Filter Function

by Online Tutorials Library


In Spark, the filter function returns a new dataset formed by selecting those elements of the source on which the given predicate returns true. In other words, it retains only the elements that satisfy the condition.

Example of Filter function

In this example, we filter the given data and retrieve all the values except 35.

  • To open Spark in Scala mode, run the following command.

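A minimal sketch of this step (the original screenshot is not reproduced); `spark-shell` is the interactive launcher bundled with a standard Spark installation:

```shell
# Start the interactive Spark shell in Scala mode.
$ spark-shell
```

This opens a Scala REPL with a preconfigured SparkContext available as `sc`.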

  • Create an RDD using a parallelized collection.
  • Now, we can read the generated result by using the following command.

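The two steps above might look like this in the shell. The sample values are illustrative assumptions (the original screenshots are not reproduced), chosen so that 35 appears in the data:

```scala
// Create an RDD from a local collection; sc is the SparkContext
// provided automatically by spark-shell.
val data = sc.parallelize(List(10, 20, 35, 40))

// Read the generated result back to the driver.
data.collect()
// e.g. Array(10, 20, 35, 40)
```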

  • Apply the filter function, passing the predicate to evaluate.
  • Now, we can read the generated result by using the following command.

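Continuing the sketch above (with the same assumed sample values), the filter step and its result:

```scala
// Keep every element for which the predicate returns true,
// i.e. everything except 35.
val filtered = data.filter(x => x != 35)

// Read the filtered result back to the driver.
filtered.collect()
// e.g. Array(10, 20, 40)
```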

Here, we get the desired output: every value except 35.
