Installation Issues

This page addresses common errors you might run into while installing Spark on your system.


Common issues/errors:

If you are getting an error while installing Spark 3.2, try installing Spark 3.0.3 instead.

Alternate steps to resolve the error:

OPTION 1

Start a Spark master manually using the command:

spark-class org.apache.spark.deploy.master.Master

The output should confirm that the master started successfully, with the web UI available at localhost:8080 and the master itself listening at spark://localhost:7077.
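Before connecting, you can optionally verify that the master is reachable. A minimal sketch in Python, assuming the web UI is on its default port 8080:

import urllib.request

# Poll the master's web UI; an HTTP 200 response means the master is up.
with urllib.request.urlopen("http://localhost:8080") as response:
    print(response.status)  # expected: 200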

Now open a second command prompt and connect a Spark shell to the master:

spark-shell --master spark://localhost:7077

The shell should start up and report that it has connected to the master at spark://localhost:7077.
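Since most of this guide works in PySpark rather than the Scala shell, you may prefer to test the connection from Python instead. A minimal sketch, assuming PySpark is installed and the master started above is still running (the app name here is arbitrary):

from pyspark.sql import SparkSession

# Attach a session to the manually started standalone master.
spark = (SparkSession.builder
         .master("spark://localhost:7077")
         .appName("connection-test")
         .getOrCreate())

# Run a tiny job to confirm the cluster accepts work.
print(spark.range(5).count())  # expected output: 5

spark.stop()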

OPTION 2

Install an older Spark version, preferably Spark 3.0.3.
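After installing, it is worth confirming which version is actually being picked up. One way to check, assuming the pyspark package on your machine matches your Spark install:

from pyspark.sql import SparkSession

# Start a throwaway local session and print the Spark version it reports.
spark = SparkSession.builder.master("local[*]").appName("version-check").getOrCreate()
print(spark.version)  # should print 3.0.3 after the downgrade
spark.stop()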