Test discovery failed with pyspark #8311

Closed
Timbonz49 opened this issue Oct 31, 2019 · 1 comment
Labels: bug (Issue identified by VS Code Team member as probable bug)

Comments

@Timbonz49

Environment data

  • VS Code version: 1.40.0-insider (system setup)
    Commit: 33cc09d
    Date: 2019-10-28T11:50:41.596Z
    Electron: 6.1.2
    Chrome: 76.0.3809.146
    Node.js: 12.4.0
    V8: 7.6.303.31-electron.0
    OS: Windows_NT x64 10.0.16299

Name: Python
Id: ms-python.python
Description: Linting, Debugging (multi-threaded, remote), Intellisense, code formatting, refactoring, unit tests, snippets, and more.
Version: 2019.11.45343-dev
Publisher: Microsoft
VS Marketplace Link: https://marketplace.visualstudio.com/items?itemName=ms-python.python

  • Extension version (available under the Extensions sidebar): 2019.11.45343-dev
  • OS and version: WSL Ubuntu
  • Python version (& distribution if applicable, e.g. Anaconda): Python 3.6.8
  • Type of virtual environment used: venv
  • Relevant/affected Python packages and their versions: pyspark (version not given)
  • Jedi or Language Server? (i.e. what is "python.jediEnabled" set to; see #3977): Jedi enabled

Expected behaviour

Test discovery succeeds and lists the tests.

Actual behaviour

python /home/tim/.vscode-server-insiders/extensions/ms-python.python-2019.11.45343-dev/pythonFiles/testing_tools/run_adapter.py discover pytest -- --rootdir /mnt/c/projects/data_platform -s --cache-clear .
Test Discovery failed:
Error: 19/10/31 01:58:05 WARN Utils: Your hostname, T422B-L103550 resolves to a loopback address: 127.0.1.1; using 192.168.123.134 instead (on interface eth3)
19/10/31 01:58:05 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
19/10/31 01:58:05 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).

Our modules use the following pyspark imports:

from pyspark.sql import SparkSession
import pyspark.sql

I think the issue is related to the stderr output shown above. If I append "2> /dev/null" to the command, I get the expected list/dict/JSON output.

@Timbonz49 added the triage-needed and bug labels on Oct 31, 2019
@brettcannon
Member

Duplicate of #6594

@brettcannon marked this as a duplicate of #6594 on Oct 31, 2019
@ghost removed the triage-needed label on Oct 31, 2019
The lock bot locked this issue as resolved and limited the conversation to collaborators on Nov 7, 2019