Apply filters on a DataFrame in Spark

How to filter a DataFrame using Spark (PySpark).

Input:

id,name,marks
1,Shiva,90
2,Ram,85
3,Mohan,95
4,Raju,96


Code:

Let's suppose you have a DataFrame named df that holds the input above.
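
If you want to create df yourself from the sample input, here is one minimal sketch (the SparkSession variable spark and the file path marks.csv are assumptions for illustration, not part of the original post):

from pyspark.sql import SparkSession

# Start (or reuse) a SparkSession, the entry point for DataFrame operations
spark = SparkSession.builder.appName("filter-example").getOrCreate()

# Load the sample CSV shown above; 'marks.csv' is a hypothetical path
df = spark.read.csv("marks.csv", header=True, inferSchema=True)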

# Keep only the rows where the name column equals 'Shiva'
result_df = df.filter(df['name'] == 'Shiva')
result_df.show()

or, equivalently, using the col() function:

from pyspark.sql.functions import col

# col('name') refers to the 'name' column of the DataFrame being filtered
result_df = df.filter(col('name') == 'Shiva')
result_df.show()
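
With the sample input above, both versions should print something like:

+---+-----+-----+
| id| name|marks|
+---+-----+-----+
|  1|Shiva|   90|
+---+-----+-----+

The same pattern works for other conditions too, for example df.filter(col('marks') > 90) to keep only the rows whose marks are above 90.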
