Environment data
Commit: 33cc09d
Date: 2019-10-28T11:50:41.596Z
Electron: 6.1.2
Chrome: 76.0.3809.146
Node.js: 12.4.0
V8: 7.6.303.31-electron.0
OS: Windows_NT x64 10.0.16299
Name: Python
Id: ms-python.python
Description: Linting, Debugging (multi-threaded, remote), Intellisense, code formatting, refactoring, unit tests, snippets, and more.
Version: 2019.11.45343-dev
Publisher: Microsoft
VS Marketplace Link: https://marketplace.visualstudio.com/items?itemName=ms-python.python
"python.jediEnabled"
set to; more info How to update the language server to the latest stable version #3977): Jedi enabledExpected behaviour
Test discovery works.
Actual behaviour
python /home/tim/.vscode-server-insiders/extensions/ms-python.python-2019.11.45343-dev/pythonFiles/testing_tools/run_adapter.py discover pytest -- --rootdir /mnt/c/projects/data_platform -s --cache-clear .
Test Discovery failed:
Error: 19/10/31 01:58:05 WARN Utils: Your hostname, T422B-L103550 resolves to a loopback address: 127.0.1.1; using 192.168.123.134 instead (on interface eth3)
19/10/31 01:58:05 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
19/10/31 01:58:05 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
We have test modules that use Python modules with the following imports:

from pyspark.sql import SparkSession
import pyspark.sql
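Because those pyspark imports sit at module top level, pytest's collection (which imports each test module) triggers Spark's startup and its stderr warnings. A minimal stdlib sketch of the same failure mode, using a hypothetical stand-in module in place of pyspark:

```python
import importlib.util
import io
import pathlib
import sys
import tempfile

# Stand-in "pyspark": importing it writes a warning to stderr, the way
# Spark's JVM startup does when the real pyspark initializes.
src = "import sys\nsys.stderr.write('WARN Utils: noisy startup\\n')\n"

with tempfile.TemporaryDirectory() as d:
    path = pathlib.Path(d) / "fake_pyspark.py"
    path.write_text(src)

    captured = io.StringIO()
    old, sys.stderr = sys.stderr, captured
    try:
        # Equivalent of a top-level `import fake_pyspark` in a test module:
        # the side effect fires as soon as the module is imported.
        spec = importlib.util.spec_from_file_location("fake_pyspark", path)
        mod = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(mod)
    finally:
        sys.stderr = old

print(repr(captured.getvalue()))  # → 'WARN Utils: noisy startup\n'
```

Deferring the pyspark import into a pytest fixture or inside the test function body would keep discovery itself free of these side effects.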
I think the issue is related to the stderr output shown above (bold in the original formatting). If I add "2> /dev/null" to the command, I get the expected list/dict/JSON output.
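That workaround works because the adapter's JSON payload goes to stdout while Spark's warnings go to stderr; a consumer that merges the two streams can no longer parse the payload. A small sketch of that failure mode (the child process here is a stand-in, not the real run_adapter.py):

```python
import json
import subprocess
import sys

# Stand-in child process: a WARN line on stderr (like Spark) and the
# discovery payload as JSON on stdout (like run_adapter.py).
child = (
    "import sys, json;"
    "print('19/10/31 01:58:05 WARN Utils: noisy startup', file=sys.stderr);"
    "print(json.dumps({'tests': []}))"
)

# Merging stderr into stdout corrupts the payload, so parsing fails.
merged = subprocess.run(
    [sys.executable, "-c", child],
    stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=True,
)
try:
    json.loads(merged.stdout)
except json.JSONDecodeError:
    print("merged streams: JSON parse failed")

# Discarding stderr (the effect of `2> /dev/null`) leaves clean JSON.
clean = subprocess.run(
    [sys.executable, "-c", child],
    stdout=subprocess.PIPE, stderr=subprocess.DEVNULL, text=True,
)
print(json.loads(clean.stdout))  # → {'tests': []}
```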