In this PySpark tutorial, we will discuss how to use the select() method to display particular columns of a PySpark DataFrame.
Introduction:
A DataFrame in PySpark is a two-dimensional data structure: one dimension refers to the rows and the other to the columns, so it stores data in rows and columns.
Let's install the pyspark module before we begin. Python modules are installed with the pip command.
Syntax:
pip install module_name
Installing PySpark:
pip install pyspark
Steps to create a DataFrame in PySpark:
1. Import the modules below
import pyspark
from pyspark.sql import SparkSession
2. Create a Spark app named tutorialsinhand using the getOrCreate() method
Syntax:
spark = SparkSession.builder.appName('tutorialsinhand').getOrCreate()
3. Create a list of values for the DataFrame
4. Pass this list to the createDataFrame() method to create the PySpark DataFrame
Syntax:
spark.createDataFrame(list of values)
Let's create a PySpark DataFrame with 5 rows and 3 columns.
# import the below modules
import pyspark
from pyspark.sql import SparkSession
# create an app
spark = SparkSession.builder.appName('tutorialsinhand').getOrCreate()
#create a list of data
values = [{'rollno': 1, 'student name': 'Gottumukkala Sravan','marks': 98},
{'rollno': 2, 'student name': 'Gottumukkala Bobby','marks': 89},
{'rollno': 3, 'student name': 'Lavu Ojaswi','marks': 90},
{'rollno': 4, 'student name': 'Lavu Gnanesh','marks': 78},
{'rollno': 5, 'student name': 'Chennupati Rohith','marks': 100}]
# create the dataframe from the values
data = spark.createDataFrame(values)
#display dataframe
data.show()
Output:
+-----+------+-------------------+
|marks|rollno| student name|
+-----+------+-------------------+
| 98| 1|Gottumukkala Sravan|
| 89| 2| Gottumukkala Bobby|
| 90| 3| Lavu Ojaswi|
| 78| 4| Lavu Gnanesh|
| 100| 5| Chennupati Rohith|
+-----+------+-------------------+
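Notice that the columns in the output appear as marks, rollno, student name rather than in the order the dict keys were written; when the schema is inferred from a list of dicts, Spark decides the field order itself (here it comes out alphabetically). If a fixed order is needed, select() can reorder the columns. A minimal sketch, reusing the data DataFrame created above:
# reorder the columns with select(); this returns a new DataFrame
data.select('rollno', 'student name', 'marks').show()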
Method - 1 : Using column names
Here, we specify the column names directly inside the select() method.
Syntax:
dataframe.select('column1', 'column2', ...)
where column1, column2, ... are the column names.
Example:
In this example, we will select the 'student name' and 'marks' columns.
# import the below modules
import pyspark
from pyspark.sql import SparkSession
# create an app
spark = SparkSession.builder.appName('tutorialsinhand').getOrCreate()
#create a list of data
values = [{'rollno': 1, 'student name': 'Gottumukkala Sravan','marks': 98},
{'rollno': 2, 'student name': 'Gottumukkala Bobby','marks': 89},
{'rollno': 3, 'student name': 'Lavu Ojaswi','marks': 90},
{'rollno': 4, 'student name': 'Lavu Gnanesh','marks': 78},
{'rollno': 5, 'student name': 'Chennupati Rohith','marks': 100}]
# create the dataframe from the values
data = spark.createDataFrame(values)
#display 'student name' and 'marks' columns
data.select('student name','marks').show()
Output:
The 'student name' and 'marks' columns are displayed.
+-------------------+-----+
| student name|marks|
+-------------------+-----+
|Gottumukkala Sravan| 98|
| Gottumukkala Bobby| 89|
| Lavu Ojaswi| 90|
| Lavu Gnanesh| 78|
| Chennupati Rohith| 100|
+-------------------+-----+
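Note that select() does not change the original DataFrame; it returns a new DataFrame holding only the chosen columns, which can be stored in a variable and reused. A small sketch, where the variable name names is only illustrative:
# select() returns a new DataFrame; 'data' still contains all three columns
names = data.select('student name', 'marks')
names.show()
print(names.columns)   # ['student name', 'marks']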
Method - 2 : Using column names with DataFrame
Here, we specify the column names inside the select() method along with the DataFrame name.
Syntax:
dataframe.select(dataframe['column1'], dataframe['column2'], ...)
where column1, column2, ... are the column names.
Example:
In this example, we will select the 'student name' and 'marks' columns.
# import the below modules
import pyspark
from pyspark.sql import SparkSession
# create an app
spark = SparkSession.builder.appName('tutorialsinhand').getOrCreate()
#create a list of data
values = [{'rollno': 1, 'student name': 'Gottumukkala Sravan','marks': 98},
{'rollno': 2, 'student name': 'Gottumukkala Bobby','marks': 89},
{'rollno': 3, 'student name': 'Lavu Ojaswi','marks': 90},
{'rollno': 4, 'student name': 'Lavu Gnanesh','marks': 78},
{'rollno': 5, 'student name': 'Chennupati Rohith','marks': 100}]
# create the dataframe from the values
data = spark.createDataFrame(values)
#display 'student name' and 'marks' columns
data.select(data['student name'],data['marks']).show()
Output:
The 'student name' and 'marks' columns are displayed.
+-------------------+-----+
| student name|marks|
+-------------------+-----+
|Gottumukkala Sravan| 98|
| Gottumukkala Bobby| 89|
| Lavu Ojaswi| 90|
| Lavu Gnanesh| 78|
| Chennupati Rohith| 100|
+-------------------+-----+
We can also use the '.' operator to access columns through the DataFrame name.
But make sure the column name contains no spaces, because Python attribute access cannot reference such columns (see the short sketch after the example below).
Syntax:
dataframe.select(dataframe.column1, dataframe.column2, ...)
where column1, column2, ... are the column names.
Example:
In this example, we will select the 'rollno' and 'marks' columns.
# import the below modules
import pyspark
from pyspark.sql import SparkSession
# create an app
spark = SparkSession.builder.appName('tutorialsinhand').getOrCreate()
#create a list of data
values = [{'rollno': 1, 'student name': 'Gottumukkala Sravan','marks': 98},
{'rollno': 2, 'student name': 'Gottumukkala Bobby','marks': 89},
{'rollno': 3, 'student name': 'Lavu Ojaswi','marks': 90},
{'rollno': 4, 'student name': 'Lavu Gnanesh','marks': 78},
{'rollno': 5, 'student name': 'Chennupati Rohith','marks': 100}]
# create the dataframe from the values
data = spark.createDataFrame(values)
#display 'rollno' and 'marks' columns
data.select(data.rollno,data.marks).show()
Output:
+------+-----+
|rollno|marks|
+------+-----+
| 1| 98|
| 2| 89|
| 3| 90|
| 4| 78|
| 5| 100|
+------+-----+
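Because the '.' operator is ordinary Python attribute access, it cannot reach a column whose name contains a space, such as 'student name' in our DataFrame. In that case, fall back to the plain string form (Method 1) or the bracket form (Method 2). A brief sketch:
# data.student name  -> invalid Python syntax, so the dot operator cannot be used here
data.select('student name').show()           # string form works
data.select(data['student name']).show()     # bracket form also works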