bunnygirl [she/her]

bunny-vibe

  • 3 Posts
  • 18 Comments
Joined 1 year ago
Cake day: November 14th, 2023

  • Sorry for the late response,

    I can’t help with PySpark specifically because I have no experience with it. In general, though, you’ll have to install the tooling you need to compile/run the program inside WSL. I haven’t used Spark in years so I don’t know the specifics, but you’ll want at least Java and Python installed there. On Ubuntu that means the packages default-jdk, python3, python3-pip, python3-venv (if you’re using venv), plus python-is-python3 for convenience.

    If you’re using venv, you might want to rerun python -m venv env to make sure it has the files Bash needs, then do source env/bin/activate to activate it. You might also have to reinstall pyspark from the Bash shell in case it needs to build anything platform-specific.

    You can set environment variables in ~/.bashrc (that’s the home dir in the Linux VM, not Windows, so edit it from the terminal, e.g. nano ~/.bashrc, or vim ~/.bashrc if you’re familiar with vi). The lines take the shape export VARIABLE=VALUE (put quotes around VALUE if it contains spaces etc.), then start a new shell to load them (exec bash replaces the currently running shell with a new process).
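    To sketch the setup steps above as a shell session (the venv name env and the use of pip for pyspark are assumptions; adjust to your project):

    ```shell
    # Run these inside the WSL/Ubuntu shell, not Windows.
    # Java + Python tooling from Ubuntu's repos:
    sudo apt update
    sudo apt install -y default-jdk python3 python3-pip python3-venv python-is-python3

    # Recreate the venv so it has the Linux-side files Bash needs,
    # then activate it and reinstall pyspark in case anything is platform-specific.
    python -m venv env
    source env/bin/activate
    pip install pyspark
    ```
    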

    From there you should be able to run the code as you normally would, just inside WSL instead.
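    The ~/.bashrc step above, concretely — these export lines are what you’d append to the file (the variable names and paths here are purely illustrative, not Spark-specific advice):

    ```shell
    # Lines as they would appear in ~/.bashrc; sourcing the file sets the variables.
    export SPARK_HOME="/opt/spark"          # hypothetical install path
    export PATH="$SPARK_HOME/bin:$PATH"     # prepend its bin dir to PATH
    export MY_OPTS="-Xmx2g -Dfoo=bar"       # quote values that contain spaces
    ```

    After editing the file, run exec bash so the new shell picks the variables up.
    
    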