Post History
Answer
#1: Initial revision
[As of version 2.2](https://issues.apache.org/jira/browse/SPARK-18267), installing PySpark also installs Spark itself, so you can run `pip install pyspark` (or the equivalent) and use it in your normal Python environment. The [PySpark getting started docs](https://spark.apache.org/docs/latest/api/python/getting_started/index.html) also include links to live notebooks, though as of this writing they appear to be unavailable.