Create scaling.py and implement WVM model #807

Merged: 15 commits merged into pvlib:master on Dec 2, 2019

Conversation

@jranalli (Contributor) commented Nov 1, 2019

  • Closes #806: Add Wavelet Variability Model (WVM) for calculating spatial smoothing of irradiance
  • I am familiar with the contributing guidelines.
  • Fully tested. Added and/or modified tests to ensure correct behavior for all reasonable inputs. Tests (usually) must pass on the TravisCI and Appveyor testing services.
  • Updates entries to docs/sphinx/source/api.rst for API changes.
  • Adds description and name entries in the appropriate docs/sphinx/source/whatsnew file for all changes.
  • Code quality and style is sufficient. Passes LGTM and SticklerCI checks.
  • New code is fully documented. Includes sphinx/numpydoc compliant docstrings and comments in the code where necessary.
  • Pull request is nearly complete and ready for detailed review.

Brief description of the problem and proposed solution (if not already fully described in the issue linked to above):

I put in just the "discrete" version of distance finding for now, because I'm already looking at quite a bit of code, and the guidelines suggest keeping it as short as is reasonable. I have the other distance finding modes complete, but they're an additional 75 lines of code within the main algorithm, and they're likely to prompt a bit of discussion as to the best approach.

@cwhanse (Member) commented Nov 1, 2019

@jranalli thanks! If you are OK with it, I'd like to use this PR as a way for us to explore improving our review process - something we've been discussing but haven't yet tried out.

We should review the API (first priority) and numerical performance. I'd like to communicate who plans to review, so please reply here if you do. @wholmgren @mikofski @adriesse @markcampanelli @whoeverelseiswatchingthis

I will ask the originator of the algorithm to check the porting.

@jranalli (Contributor Author) commented Nov 1, 2019

@cwhanse I'm completely ok with a new process. Any process would be new to me, so it's all the same.

@kevinsa5 oops you're right. I'll straighten that out.

@wholmgren (Member)

Thanks @jranalli for the submission and thanks @cwhanse for coordinating. I'd use this code and I am interested in reviewing this PR. I skimmed the code and it looks like a great start. I agree that this scope is appropriate and other methods would be best implemented in future PRs.

@adriesse (Member) commented Nov 3, 2019

@cwhanse Thanks for the invitation, but I need a breather; will try to resist temptations and sit this one out.

@cwhanse (Member) commented Nov 4, 2019

@jranalli I pushed a merge to your branch to fix the conflict in the whatsnew file (the conflict was nothing you did, it happens sometimes). I should have notified you first, my apologies. Just pull your branch back from GitHub to update your local files and all should be fine.

The test failures are also nothing you did - we have some functions in pvlib that fetch data from servers, and when the servers don't respond during tests, the tests fail. Often, these issues are resolved by restarting the build from the test report page.

@cwhanse cwhanse (Member) left a comment

@wholmgren let me know he'd be a little longer before he could do a review for the API. Here are my comments on the API and function organization to help move the review process ahead.

pvlib/scaling.py Outdated

Parameters
----------
cs_series : numeric or pandas.Series
Member

Prefer clearsky_index for consistency with clearsky_index function

pvlib/scaling.py Outdated
The density of installed PV in W/m^2. Must be specified for 'square'
method, ignored otherwise. Default of 41 W/m^2 is 1MW per 6 acres.

dt : numeric
Member

float, default None. Need to mention units (seconds) and add [s] at the end of the description.

Let's move dt up in the argument list ahead of capacity and density

pvlib/scaling.py Outdated
The type of plant distribution to model.
Options are ``'discrete', 'square', 'polygon'``.

capacity : numeric
Member

float, default None

pvlib/scaling.py Outdated

Returns
-------
smoothed : pandas.Series or numeric, depending on input type
Member

numeric or pandas.Series

pvlib/scaling.py Outdated
import pandas as pd


def wvm(cs_series, latitude, longitude, cloud_speed,
Member

latitude and longitude are converted to Northing/Easting coordinates, in order to compute distances. I think (but am not sure) that it would be better to input Northing/Easting to wvm and supply a public helper function that converts a list of (lat,long) pairs to a list of (Northing, Easting) pairs.

I'm more sure (but not certain) that a single input of points as a list of tuples (northing, easting) is better. This would match how geopandas and shapely store coordinates. In the longer term (not in this PR, I am certain) we could consider extending this interface to accept shapely or geopandas files as input.

Would appreciate others' views here @wholmgren @mikofski

Member

Thinking a bit more about this, the wavelet transform is agnostic about direction - it only cares about distance. So there's no need to maintain any particular orientation for the x,y coordinate axes. The input could be a list of (x, y) coordinate pairs dimensioned in meters.
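
To illustrate the point (a minimal sketch, not code from this PR): the pairwise separations that feed the variability calculation are unchanged by rotating the coordinate axes, so any (x, y) pairs in meters carry all the needed information.

import numpy as np
from scipy.spatial.distance import pdist

# Hypothetical site layout as (x, y) pairs in meters
xy = np.array([(0.0, 0.0), (500.0, 0.0), (500.0, 500.0)])

# Rotate the axes by an arbitrary angle
theta = np.deg2rad(30)
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta), np.cos(theta)]])
xy_rot = xy @ rot.T

# Pairwise distances are identical, so axis orientation does not matter
assert np.allclose(pdist(xy), pdist(xy_rot))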

Member

Thanks @jranalli for this contribution. I agree with @cwhanse, a list of tuples makes more sense to me than the original pvl_WVM API of separate latitude and longitude lists. Whether these are (lat, lon) pairs or (x_east, y_north) I'm less sure of.

I did find these existing conversion methods:

So it does seem like Point syntax is more common. @anomam has experience using Shapely and GeoPandas.
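
For context, a minimal sketch of the coordinate ordering being referenced (shapely is not a pvlib dependency; this is illustration only):

from shapely.geometry import Point

# Shapely stores coordinates x-first: Point(x, y), i.e. (easting/lon, northing/lat)
p = Point(5.0, 10.0)
print(p.x, p.y)  # 5.0 10.0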

pvlib/scaling.py Outdated
return smoothed, wavelet, tmscales


def _latlon_to_dist(latitude, longitude):
Member

I'd make this function public and name it latlon_to_xy perhaps

Contributor Author

The methodology here (from Matlab) is very much a "good enough" one to serve as the internals of this function, but I don't think it would really be recommended for proper geographic distance calculation. Other packages exist to do it right. If the decision is to make wvm accept easting/northing pairs, and this is made public as a helper so that those other packages remain optional, I think it would warrant notes to that effect in the docs.

Member

I agree. This adds support for switching from lat, lon input to x, y. The _latlon_to_dist function is an added convenience, but not a necessity. Because the WVM is likely to be used over relatively small distances (10s of km), I'm not overly concerned about the numerical accuracy here - should be good enough.

@mikofski mikofski (Member) Nov 12, 2019

My vote is to change the inputs to discrete pairs of (x_east, y_north) and either discard this function or, as @cwhanse suggests, make it public, move it to tools, add the disclaimer/warning Cliff suggests (valid only over short distances), and provide links to pyproj.transform() and other more accurate projection methods.
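
As a reference for the more accurate projection methods mentioned above, a conversion with pyproj's Transformer interface could look roughly like this (a sketch only; pyproj is not a pvlib dependency and the UTM zone is just an example):

from pyproj import Transformer

# Project WGS84 lat/lon to UTM zone 31N coordinates in meters.
# The appropriate projected CRS depends on where the plant is located.
transformer = Transformer.from_crs("EPSG:4326", "EPSG:32631", always_xy=True)

lats = [9.99, 10.0, 10.01]
lons = [4.99, 5.0, 5.01]
# always_xy=True makes the call signature (lon, lat) -> (x, y)
x, y = transformer.transform(lons, lats)
xy_pairs = list(zip(x, y))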

@wholmgren wholmgren (Member) Nov 12, 2019

I don't really consider tools to be part of the public API even though it's not explicitly private. I can't think of any instance in which we recommend that users directly call a tools function. I recommend keeping it here unless we know of another instance in which pvlib itself would make use of this function.

In any case, I agree with earlier comments to require (x, y) coordinates instead of (lat, lon) coordinates, and I don't actually think pvlib needs this function at all.

Member

In any case, I agree with earlier comments to require distances instead of coordinates, and I don't actually think pvlib needs this function at all.

@wholmgren Are you OK with x,y coordinate pairs as input? Although the coordinates are only used to compute distances, asking for distances as input is less intuitive to me.

Member

Sorry.. yes, updated my unclear comment above.

pvlib/scaling.py Outdated

Parameters
----------
latitude : numeric
Member

Make this input consistent with the input of wvm

Comment on lines 12 to 38
# Sample positions
lat = np.array((9.99, 10, 10.01))
lon = np.array((4.99, 5, 5.01))
# Sample cloud speed
cloud_speed = 5
# Generate a sample clear_sky_index and time vector.
clear_sky_index = np.ones(10000)
clear_sky_index[5000:5005] = np.array([1, 1, 1.1, 0.9, 1])
time = np.arange(0, len(clear_sky_index))
# Sample dt
dt = 1

# Expected distance and positions for sample lat/lon given above
expect_dist = np.array((1560.6, 3121.3, 1560.6))
expect_xpos = np.array([554863.4, 555975.4, 557087.3])
expect_ypos = np.array([1106611.8, 1107719.5, 1108827.2])

# Expected timescales for dt = 1
expect_tmscale = [2, 4, 8, 16, 32, 64, 128, 256, 512, 1024, 2048, 4096]

# Expected wavelet for indices 5000:5004 using clear_sky_index above
expect_wavelet = np.array([[-0.025, 0.05, 0., -0.05, 0.025],
                           [0.025, 0., 0., 0., -0.025],
                           [0., 0., 0., 0., 0.]])

# Expected smoothed clear sky index for indices 5000:5004 using inputs above
expect_cs_smooth = np.array([1., 1.0289, 1., 0.9711, 1.])
Member

This section could be converted to a fixture. Could be done in a future PR.
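
For example, the clear-sky-index sample data above could be wrapped in fixtures along these lines (a sketch of the pattern only; the actual conversion was done in a later commit on this PR):

import numpy as np
import pytest


@pytest.fixture
def clear_sky_index():
    # Mostly-clear series with a brief dip, as in the sample data above
    cs_index = np.ones(10000)
    cs_index[5000:5005] = np.array([1, 1, 1.1, 0.9, 1])
    return cs_index


@pytest.fixture
def time(clear_sky_index):
    return np.arange(0, len(clear_sky_index))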

pvlib/scaling.py Outdated
cloud_speed : numeric
Speed of cloud movement in meters per second

method : string
Member

plant_type rather than method if we keep the kwarg

Member

I vote to ditch the method (plant type/shape) kwarg in favor of the more simplified input you mention above

pvlib/scaling.py Outdated


def wvm(cs_series, latitude, longitude, cloud_speed,
        method, capacity=None, density=41, dt=None):
Member

I looked into the method meaning. Basically, it's a convenience to communicate a polygon (or simple square) in lieu of a list of points. In MATLAB, a polygon or square is gridded at some resolution to create a list of points that cover the area, after which the algorithm is the same as if discrete is provided.

My vote here is to remove method and the kwargs supporting the 'square' option (capacity and density). In a future PR we can implement a public function prepare_wvm_inputs that accepts the simpler inputs (e.g. a polygon) and returns the list of grid points. That keeps the wvm function with a single purpose: compute the smoothed clearsky index using the wavelet transform over a grid of points.

Would appreciate others' views here @wholmgren @mikofski
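
For illustration, the gridding step that the 'square' method performs could eventually live in a helper like the sketch below (prepare_wvm_inputs is only a proposed name; nothing here is part of this PR):

import numpy as np


def prepare_wvm_inputs_square(capacity, density=41, spacing=10):
    # Sketch: approximate a square plant footprint as a grid of (x, y)
    # points in meters, spaced `spacing` meters apart.
    # capacity in W, density in W/m^2 (41 W/m^2 is roughly 1 MW per 6 acres).
    side = np.sqrt(capacity / density)
    axis = np.arange(0, side, spacing)
    xx, yy = np.meshgrid(axis, axis)
    return list(zip(xx.ravel(), yy.ravel()))


# e.g. a 1 MW plant at the default 41 W/m^2 density
points = prepare_wvm_inputs_square(1e6, density=41, spacing=25)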

Contributor Author

From my end, I like this suggestion. Wrapping these together was a holdover from Matlab, but given that it's being done in a more compartmented way here, I think keeping this function simple is a better choice.

Member

I agree.

@jranalli (Contributor Author) commented Nov 9, 2019

Being new to this, I had been assuming I should wait to make any changes until discussion is finished, but I also saw your comment about "moving review ahead." Is there anything I should start addressing (e.g. the obvious docstring corrections), or should I just continue to hang tight?

@cwhanse (Member) commented Nov 10, 2019

Being new to this, I had been assuming I should wait to make any changes until discussion is finished, but I also saw your comment about "moving review ahead." Is there anything I should start addressing (e.g. the obvious docstring corrections), or should I just continue to hang tight?

The main issue is changing from lat, long to x, y. I'd like others to weigh in.

@mikofski (Member) commented Nov 12, 2019

I haven't looked over the code in great detail, but I agree with @cwhanse's API suggestions:

  • change inputs from separate latitude and longitude lists to a single list of point tuples using (x, y)
  • remove the method argument and explain and/or provide helper to generate grid for "square" plant type
  • remove the lat/lon (wgs84) to easting/northing (bng) private function and just input points as (x, y) pairs, and/or possibly make it a public helper in tools with caveats and links to other projection conversion methods

@cwhanse (Member) commented Nov 12, 2019

Being new to this, I had been assuming I should wait to make any changes until discussion is finished, but I also saw your comment about "moving review ahead." Is there anything I should start addressing (e.g. the obvious docstring corrections), or should I just continue to hang tight?

The main issue is changing from lat, long to x, y. I'd like others to weigh in.

@jranalli I think we have consensus on x,y as input rather than lat/long, and on removing the method kwarg. I would use meters as the unit on the coordinates.

I'm still waiting to hear from Matt Lave about a technical review, I'll let you know soon.
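
Given that consensus, a call could look roughly like this (a sketch; the argument names are assumed from the discussion above and the quantities are placeholders):

import pandas as pd
from pvlib import scaling

# (x, y) positions of the point sensors in meters
positions = [(0, 0), (100, 0), (100, 100), (0, 100)]

# Clear-sky index of a single point sensor at 1 s resolution
times = pd.date_range('2019-06-01 12:00', periods=10000, freq='1s')
clearsky_index = pd.Series(1.0, index=times)

cloud_speed = 5  # m/s
smoothed, wavelet, tmscales = scaling.wvm(clearsky_index, positions,
                                          cloud_speed, dt=1)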

@jranalli (Contributor Author)

These are the API changes that were requested and I think everything is back to working correctly.

@wholmgren (Member)

What do we need to get this PR across the finish line? A technical review?

@wholmgren wholmgren added this to the 0.7.0 milestone Nov 26, 2019
@cwhanse (Member) commented Nov 27, 2019

What do we need to get this PR across the finish line? A technical review?

Yes. I'll take care of it today; I don't know when Matt will get to it.

pvlib/scaling.py Outdated
cs_long = pd.DataFrame(cs_long)

# Compute wavelet time scales
mindt = np.ceil(np.log(dt)/np.log(2)) # Minimum wavelet dt
Member

I'd prefer min_tmscale instead of mindt, since the minimum is of the time scales not of the time step. Similar preference for max_tmscale over maxdt

@cwhanse cwhanse (Member) left a comment

Technical review: this implementation translates well from the papers and from the MATLAB WVM code. I compared some additional calculations between this implementation and MATLAB and get the same results (amplitude series at each timescale) for the same input (within precision of copying between MATLAB and python).

One minor comment about naming internal variables.

There may be some performance improvements for future work. The MATLAB code calculates the variability reduction factors using a subset of the pairwise distances when the set of distances becomes large. OK with me to defer these improvements until users request them.

Thanks again @jranalli for this contribution and your patience.
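
The performance note above (using only a subset of the pairwise distances when the site list gets large, as the MATLAB code does) could be sketched as follows; this is an illustration of the idea, not code from the PR:

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical large site: many (x, y) points in meters
xy = rng.uniform(0, 2000, size=(20000, 2))
n_points = len(xy)

# All ~2e8 pairwise distances would be costly; sample a subset of index pairs
n_pairs = 100_000
i = rng.integers(0, n_points, n_pairs)
j = rng.integers(0, n_points, n_pairs)
keep = i != j  # drop self-pairs
sampled_dist = np.hypot(*(xy[i[keep]] - xy[j[keep]]).T)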

@jranalli (Contributor Author)

@cwhanse My pleasure, thanks to you all for the opportunity to contribute.

The renaming is done. If there's anything else needed from me on this please just let me know.

@cwhanse (Member) commented Nov 27, 2019

@cwhanse My pleasure, thanks to you all for the opportunity to contribute.
The renaming is done. If there's anything else needed from me on this please just let me know.

Can you merge pvlib-python/master into your branch, and resolve the merge conflict in the whatsnew file? Or you can do it from this page.

@jranalli (Contributor Author)

I hate to say it, but this kind of thing is where I start pushing the boundaries of my git expertise. I think I did it right, but if not, I might need some guidance.

@cwhanse (Member) commented Nov 27, 2019

I hate to say it, but this kind of thing is where I start pushing the boundaries of my git expertise. I think I did it right, but if not, I might need some guidance.

Looks good now.

@cwhanse cwhanse (Member) left a comment

@wholmgren OK by me to merge

@wholmgren wholmgren (Member) left a comment

Looks really good. A few small suggestions below. Let me know if the test suggestions don't make sense.

from pvlib import scaling
from conftest import requires_scipy


Member

anything that's mutable and reused in multiple tests should be put in a pytest.fixture. Comments below try to address this for a few items and I'm hoping you can follow the pattern for the rest.

Contributor Author

Because of the mutable comment, I left the ints outside fixtures. If that's not right, please let me know; otherwise I think I've got it.

# Sample positions
lat = np.array((9.99, 10, 10.01))
lon = np.array((4.99, 5, 5.01))
coordinates = np.array([(lati, loni) for (lati, loni) in zip(lat, lon)])
Member

@pytest.fixture
def coordinates():
    lat = np.array((9.99, 10, 10.01))
    lon = np.array((4.99, 5, 5.01))
    coordinates = np.array([(lati, loni) for (lati, loni) in zip(lat, lon)])
    return coordinates

# Must test against central value, because latlon_to_dist uses the mean
coord = (lat[1], lon[1])
pos = scaling.latlon_to_xy(coord)
assert_almost_equal(pos, (expect_xpos[1], expect_ypos[1]), decimal=1)
Member

def test_latlon_to_xy_single(coordinates, expect_xpos, expect_ypos):
    # Must test against central value, because latlon_to_dist uses the mean
    coord = coordinates[1]  # not sure about indexing, maybe .T[1]?
    pos = scaling.latlon_to_xy(coord)
    assert_almost_equal(pos, (expect_xpos[1], expect_ypos[1]), decimal=1)

Member

consider same indexing pattern for expect_*pos --> position[1]


def test_latlon_to_xy_array():
    pos = scaling.latlon_to_xy(coordinates)
    assert_almost_equal(pos, positions, decimal=1)
Member

def test_latlon_to_xy_array(coordinates, positions):
    pos = scaling.latlon_to_xy(coordinates)
    assert_almost_equal(pos, positions, decimal=1)


def test_latlon_to_xy_list():
    pos = scaling.latlon_to_xy(coordinates.tolist())
    assert_almost_equal(pos, positions, decimal=1)
Member

def test_latlon_to_xy_list(coordinates, positions):
    pos = scaling.latlon_to_xy(coordinates.tolist())
    assert_almost_equal(pos, positions, decimal=1)

expect_ypos = np.array([1110838.8, 1111950.8, 1113062.7])

# Sample positions based on the previous lat/lon
positions = np.array([pt for pt in zip(expect_xpos, expect_ypos)])
Member

@pytest.fixture
def positions():
    return np.array([pt for pt in zip(expect_xpos, expect_ypos)])


# Expected positions for sample lat/lon given above (calculated manually)
expect_xpos = np.array([554863.4, 555975.4, 557087.3])
expect_ypos = np.array([1110838.8, 1111950.8, 1113062.7])
Member

maybe move these into the proposed positions fixture below and use indexing to get a single coordinate if needed.

@wholmgren wholmgren (Member) left a comment

looks great, thanks @jranalli!

@wholmgren wholmgren changed the title from "Create scaling.py and implement draft of WVM model" to "Create scaling.py and implement WVM model" on Dec 2, 2019
@wholmgren wholmgren merged commit a9ff286 into pvlib:master Dec 2, 2019
cwhanse pushed a commit to JPalakapillyKWH/pvlib-python that referenced this pull request Mar 18, 2020
* create initial files for scaling package

* Completed draft of WVM using only discrete distance algorithm.

* fixed lint errors, moved scipy imports and marked tests @requires_scipy

* added tests to complete coverage as far as I know how

* added lines to the sphinx documentation

* Changes following review. WVM accepts list of xy tuples as inputs. Cleanup of docs.

* Added my name to the list in the what's new docs. Hope this is the intent.

* rename internal variables, truncate comment for lint

* Removed section headings from api.rst to shorten

* Change to try catch block. Added additional test to match.

* convert tests to use fixtures

* fix lint errors

* one more lint error
CameronTStark added a commit that referenced this pull request Mar 28, 2020
* added snow_model code. Needs formatting

* starting numpy-ification

* minor changes

* added tests and made model faster

* finished tests.

* fixed lint errors.

* fixed minor lint error

* fixed bug where model stopped prematurely

* docstring changes mostly. Changed the location of division by 10 for sliding amount.

* fixed tests to account for prev change

* fixed docstring for snow slide amount

* docstring edits, move some functions to private

* rewrite, reduce helpers, remove while loop

* rename functions, review responses, add subhourly test

* fixes

* temperature -> temp_air, test correction, fix default for m

* put m=-80 back

* edits for threshold, use first data point

* add initial snow, better logic for coverage events

* lint, text fix

* workaround for py35/pandas 0.23.4

* move line

* improve comments

* improvements from review

* improve docstring

* reorder comparison, add missing inplace

* correct test, cut/paste error

* vectorize

* correct .clip

* function and file renaming

* docstring corrections, replace hack with pandas offset

* refactor bifacial merge, improve merge tests (#747)

* Simplify merge method.

Simplify merge method by using list and dictionary comprehensions instead of
nested for loops. This also avoids the need to check if some of the elements
of the reports arguments are None, and the need to check if reports contains
more than one report. This should also be a small performance improvement.
This change also adds a bit more detail to the docstring for the merge method.
The creation of a new empty report is also modified to no longer use an
intermediary list of keys.

* Update what's new doc.

* Update whatsnew file syntax.

* Rename whatsnew file to change format to rst.

* Fix linting errors.

* Remove pandas and numpy version reqs

* Update v0.7.1.rst

* Revert "Update v0.7.1.rst"

This reverts commit d131657b18b4c309f709fc3b3931b1d7c51471f3.

* Revert "Remove pandas and numpy version reqs"

This reverts commit 84d718949289c82d6d3e1eb7b0b0fa05c413adf0.

* Reword elements in merge() comprehension.

* Fix handling of None values in merge.

If one of the values in the reports argument that merge() takes is a None,
this is now properly handled by being dropped.

* Remove unnecessary checks in merge().

Returns merge method to its original state with two nested for loops, but without
the checks for the number of dictionaries in the reports argument, or the check
that each element in reports is a dictionary. Reports are only created in the build
method so a test was made to ensure that build always returns a dictionary instead.

* Add tests for merge() and build().

* Fix lint errors.

* Correct import mistake.

Previous commits I made did not import the PVFactorsReportBuilder
class correctly.

* Reinsert type check in merge loop.

* Revert "Reinsert type check in merge loop."

This reverts commit eeb80530ca9b6b1f92a401a44ad80fb43c9e6f7c.

* Fix test_build_1 test.

* Update what's new file.

Add bullet point about added tests for bifacial.py methods.

* Fix merge conflicts in what's new file.

* Revert "Fix merge conflicts in what's new file."

This reverts commit 74c8ebaa5ff81a8202f0a8a4b16571e27d64a6bb.

* Revert "Update what's new file."

This reverts commit 4447e6c5d176a7d10a2a49b64f74b102827b93ce.

* Revert "Rename whatsnew file to change format to rst."

This reverts commit 93d05595b3d88ffa90df89695eb72521dbe536e7.

* Revert "Update whatsnew file syntax."

This reverts commit c969994ff7347779ec27d19318bdbb0374bf7cb3.

* Revert "Update what's new doc."

This reverts commit 55d8f08f8a10c94e363e4e9da558a514417459ce.

* Merge what's new updates from upstream.

* Update what's new file.

* Implement IEC 61853 IAM calculations (#752)

* Insert three function placeholders in pvsystem.py

* transform docstring from matlab

* Code for iam_martin and iam_interp using greek symbol for theta

* Minor improvements.

* Tests for iam_martin_ruiz() and iam_interp()

* Removed undesirables; polished docstrings and code here and there.

* Final stickler.

* Update v0.7.0.rst

* Fix many oversights (also an old one).

* Final(?) documentation touch-ups.

* Fix tests and nan handling.

* Final(?) test fixes.

* Update v0.7.0.rst

* remove python 2 lines (#757)

* Resolve merge conflicts.

The upstream what's new file had been updated.

* Remove py2 and Windows checks.

Removed because python 2 is no longer supported, and this let us remove the
checks for Windows too.

* Remove from __future__ import print_function.

* Remove from __future__ import division.

* Revert "Remove py2 and Windows checks."

This reverts commit a762fc334c50465b6d1952dee1b0c62be5ade2e3.

* Remove has_python2 checks.

* Remove duplicate line in what's new file.

* Remove python2-sensitive imports.

* Remove out-of-date python2 comment.

* Import JSONDecodeError directly in py3 way.

* Change Tkinter import (py2) to tkinter (py3).

* Update import configparser to only py3 version.

* Update documentation comment.

* Remove strict arguments.

The case where these were relevant (Windows build using Python 2) is no
longer used because Python 2 is no longer supported.

* Remove platform_is_windows import.

It is no longer used.

* Set strict arg to True for two tests.

* Fix indentation mistake.

Convert tabs to spaces.

* Revert "Remove from __future__ import print_function."

This reverts commit 0508b33d6c5d77f3050825910415ee3da6477d3c.
This also undoes the changes to how configparser is imported.

* fix rounding issue in linke turbidity lookup function (#755)

* Resolve merge conflicts.

The upstream what's new file had been updated.

* Copy whatsnew file from upstream.

* Remove unnecessary np.around().

Instead of rounding to one decimal place, then comparing to .5,
just compare to .55 right away which preserves the same behavior as
before. If you want to stay true to the comment on line 314, use
<= .5 in comparison instead as before, but still drop the np.around().

* Use <= .5 as comparisons in _linearly_scale.

* Correct previous commit, use <= .5.

* Update what's new file.

Closes #754

* Compare to 0.500001 for margin of error.

* Rename _linearly_scale -> _degrees_to_index, refactor.

This commit reduces inputs to just two, refactors some calculations, and mainly tries
to be much clearer with the documentation.

* Add test for clearsky._degrees_to_index().

Tests that an invalid value for the degree_type argument raises an error.

* Remove try/except/finally.

The exception that is being checked for here is already checked for in the
_degrees_to_index method, so there's no need to do the check again. So this
removes the except portion, and just keeps the try and finally parts.

* Update parameter type description.

* Fix linting errors.

* Fix typo.

* Give more detail in what's new description.

* Rename arg, use context manager, and add more tech. documentation.

* Add missing zero in documentation.

* Break up line so it's < 80 characters.

* Change arg type to float or int.

* Update argument name in test_degrees_to_index_1().

* edit DIRINDEX docstring (#760)

* edit docstring

* add hyperlinks

* added snow_model code. Needs formatting

* starting numpy-ification

* minor changes

* added tests and made model faster

* finished tests.

* fixed lint errors.

* fixed minor lint error

* fixed bug where model stopped prematurely

* docstring changes mostly. Changed the location of division by 10 for sliding amount.

* fixed tests to account for prev change

* fixed docstring for snow slide amount

* Add recombination current params to all bishop88 functions (#763)

* Pass through extra recombination parameters to allow use in max_power_point().

* Add recombination current parameters to the other bishop88 functions.

* Update whatsnew.

* Add tests.

* Tests for both newton and brentq, and some formatting.

* Delete blank line.

* Add scipy requirement triggered by brentq tests.

* Add blank line.

* docstring edits

* redo docstring edits

* add parameterize

* remove blank lines

* Raise if no parameters provided to retrieve_sam() (#770) (#775)

Close #770 via #775

* refactor cec test data into fixtures (#774)

Refactors module and inverter parameter sets used for tests into conftest.py. Parameter sets are hardcoded rather than read from the current version of the SAM data files, since the values in the SAM files may not be stable.

Thanks @peque

* remove functions marked for 0.7 removal (#772)

* remove functions marked for 0.7 removal

* extra line

* move celltemp functions to celltemp.py, expose celltemp.pvsyst in ModelChain (#682)

* move celltemp functions to celltemp.py, fix docstring for pvsystem.PVSystem.sapm

* formatting

* formatting

* initial commit fit_cec_using_sam

* outline ivcurves

* use nrel-pysam

* remove sam_dir

* add Sandia single curve fit

* complete function

* add test, move code to ivtools.py

* remove single files

* fix mock

* add celltemp.pvsyst as a ModelChain option, change order of arguments in celltemp.samp

* lint

* argument order fixes

* another argument order fix

* another argument order fix, change model_params to model in celltemp.pvsyst

* more model_params to model cleanup

* pare back modelchain test

* put test back in to hit ModelChain.pvsyst_temp

* output dataframe from celltemp.pvsyst

* cleanup tests with pvsyst output as dataframe

* fix dataframe creation in celltemp.pvsyst

* fix type checking

* fix test condition

* update api, whatsnew

* comment responses

* change arguments for celltemp.sapm

* fix conflict

* fix kwarg error

* update whatsnew

* test fixes, table formatting

* docstring fixes

* edits to celltemp.pvsyst

* add PVsystem properties, move PVsystem tests back to test_pvsystem.py

* start working on ModelChain

* fix ModelChain tests

* modelchain fixes

* modelchain test fixes

* modelchain test fixes

* more modelchain test fixes

* add infer_temp_model

* lint, test fixes

* yet more modelchain test fixes

* update whatsnew, improve coverage

* fix deprecated test

* fix import

* one more MC test fix

* rename functions, change output type

* change argument names, add parameter_set lookup

* fix function names

* fix function names more

* update modelchain and tests

* more test fixes

* test fixes

* lint

* lint, merge master

* lint, remove pvwatts_dc_pvwatts_ac_system fixture from test_modelchain.py

* rename from celltemp to temperature

* lint, rename test_celltemp.py

* cleanup docstrings

* add invalid test for infer_temp_model

* review comments

* remove unrelated code

* handle deprecated attributes and functions

* test fixes

* fix documentation

* fix infer_temp_model naming, whatsnew

* test fixes and docstring edits

* handle ModelChain deprecated behavior, test type input of _translate functions

* test fixes

* fix tests

* another test fix

* pop unused kwarg

* improve test coverage

* test fixes

* add UserWarning

* improve warnings

* use property setter

* test fixes

* yet another code fix trying to distinguish kwargs from attribute of same name

* still struggling with tests

* reset system between ModelChain init

* infer parameters from racking, module_type

* add the missing colon

* delete vestigial lines

* test fixes

* fix tests, reformat whatsnew section

* test fix

* better handling of deprecated behavior, add consistency check temperature_model vs. temperature_model_parameters

* fix method check

* missing .

* update tests

* remove duplicate test value, clarify TODO comment

* consistent import of tools

* insert blank line

* remove parameter_set arg from PVSystem methods

* lint

* Update module and inverter files (#761) (#767)

* Update module and inverter files (#761)

With data files fetched from NREL/SAM@aa60bc3.

* update whatsnew

* refactor repeated code in ModelChain singlediode and LocalizedPVSystem init (#777)

* Remove duplicate code in ModelChain

* Remove duplicate code in init methods

* Create ivtools (#718)

* initial commit fit_cec_using_sam

* outline ivcurves

* use nrel-pysam

* remove sam_dir

* add Sandia single curve fit

* complete function

* add test, move code to ivtools.py

* remove single files

* change PySCC usage to wrapper

* add nrel-pysam to ci/requirements

* stickler, add pysam version in ci/requirements

* does case matter?

* remove pysam for py27, move import pysam to try/except

* add requires_scipy

* rename variables for style guide

* add test for fit_cec_with_sam

* fix conflict in requirements

* fix typo, raise syntax

* remove ivtools dir

* fix pytest fixture usage

* linter, fix np.allclose

* relax test condition

* remove try/except, add to docstring, use helper functions

* add requires_pysam

* fix import

* fix infinite loop

* update whatsnew, api.rst

* fix lint errors

* lint

* resolve comments

* comment response

* fix function name in test

* more comment responses

* add nan output to fit_cec_sam, comment responses

* lint

* consistent return nan, v_oc arg etc. defaults to None, edit description

* initialize failed

* lint

* implement exception for failed fitting

* remove requirement for sorted i

* update internal variable name for consistency

* add to __init__.py

* code style changes

* lint

* adjust error messaging in _calculate_sde_parameters

* edits to units in docstring

* change error handling in fit_sde_sandia, docstring reformat

* fix tests, docstring edits

* test math formatting

* more edits to docstring

* multiline raw string

* try another approach

* move r

* change function name, docstring edits

* update function name

* add math delimiters

* remove delimiter, use indent

* fix alignment in docstring

* some edits to docstring format

* change name back to sde

* add bad IV curves for coverage test

* separate bad IV test, fix test

* lint

* update api.rst and whatsnew

* hardwire test output rather than read from old CEC file

* test fix

* really fix it this time

* fix raise statement

* clean up whatsnew after #718 merge (#778)

* cleanup

* can't take double credit

* ModelChain 0.7 deprecations, remove times kwarg (#773)

* modelchain deprecations, remove times

* tests

* add requires_tables decorator

* flake8

* remove ModelChain.singlediode

* fix expected value

* update whatsnew

* uncovered line, whatsnew

* feedback

* more explicit docstrings

* add timeout to get_psm3 (#741)

* add timeout

* update whatsnew

* Update forecast.py comment typo

Fix minor typo in comment from DHI to GHI

* Get rid of `re` deprecation warnings (#787)

* Fix documentation references to inverter/module data (#791)

* Fix inverter names in documentation

- to match new cec updates

* Add an example module in cec data file.

* Fix `run_model()` inputs in doc

* Fix old NREL data link

* Update whatsnew

* Fix lint error

* Use Canadian Solar CS5P-220M in examples

* Create iam.py, consistent naming for IAM functions (#783)

* move iam functions to iam.py

* move function tests to test_iam.py

* adjust PVSystem methods, add deprecation for functions and PVSystem methods

* adjust PVSystem tests, test for function deprecation

* move sapm aoi function, adjust ModelChain methods

* remove _ typo

* lint fixes

* move fixture to correct place

* move sapm_module_params fixture to conftest.py

* fix cut/paste errors

* add missing keys to fixture, add missing text to pvsystem.sapm docstring

* fix and update pvsystem.sapm tests

* remove DataFrame test for sapm, lint fixes

* implement PVSystem.get_iam, add deprecation test for pvsystem.sapm_aoi_loss

* lint

* test fixes, add Material to sapm_module_params fixture

* test fixes

* add martin_ruiz to modelchain

* finish adding martin_ruiz to modelchain

* add test for ModelChain.infer_aoi_model, improve coverage

* repair delete mistake

* test for invalid aoi model parameters in ModelChain

* update api, whatsnew

* test fix

* fixture for aoi_model tests

* bad indent

* docstring and lint

* lint

* module docstring, changes to tests

* test fixes

* another test fix

* improve coverage

* fix exception test

* fix the fix

* add bare environment CI tests (#790)

* add bare linux test

* fix test directory

* whatsnew

* moar pytest

* what merge conflict?

* handle warnings from temperature model tests (#796)

* handle warnings in temperature model tests

* test for content of exceptions

* test warning message content

* fix it up

* correct indent on asserts

* fix indent on pop

* fix exceptioninfo reference

* use match kwarg

* use match kwarg throughout

* replace Pandas item() implementation with numpy's using .values (#797)

* replace Pandas item() implementation with numpy's using .values

* force value to np.ndarray to enable universal use of .item()

* mark xfail of test_get_psm3 (#803)

* mark xfail of test_get_psm3

* add line for lint fix

* coefficient estimation method following DeSoto(2006) (#784)

* function getparams_desoto added to pvsystem module. This commit is just the copy from my previous work, slighty modified for removing the PEP8 warnings.

* getparams_desoto moved from pvsystem to singlediode.
- getparams_desoto renamed in getparams_from_specs
- some initializing values were changed in accordance with Duffie&Beckman2013

* - Modification of getparams_from_specs so as it follows better the procedure of DeSoto&al(2006).

* - Bug corrected in DeSoto(2006) procedure
- test_singlediode completed with a beginning of test_getparams_from_specs

* - test_getparams_from_specs_desoto() finished
- function renamed as above

* Not sure of all changes brought by this commit because of holidays.
- function 'from_epw' added in location
- 1 modification in singlediode
- 1 modification in test_singlediode

* - function '_parse_raw_sam_df' modified. The parser engine is now defined on 'python'. If not the pd.read_csv cannot work with me.

* - ModelChain attribute 'diode_params' transformed from tuple containing pd.Series to DataFrame. Makes the use of diode_params easier for further calculations.

* - singlediode.get_params_from_specs_desoto() output changed. 'a_ref' is changed by 'nNsVth_ref'

* read_epw changed. A line has been added to convert the precipitable water from mm to cm, in order to be compatible with other functions of pvlib

* - read_epw changed. If condition added to make the conversion only in the case of TMY3

* - argument diode_params changed from tuple to pd.DataFrame

* - get_params_from_specs_desoto removed from singlediode.py

* - function fit_sdm_desoto added. Still need to be formatted

* - change on type of self.diode_params removed. Go check on branch diode_params_in_df for seeing it

* - Function 'fit_sdm_desoto' cleaned and variables names named as in 'fit_sdm_cec_sam'

* - all changes made on other files than ivtools.py removed (cleaning for comparing before PR)

* - other differences cleaned

* - renaming of one variable and minor documentation modifications

* - Beginning of writting of test_fit_sdm_desoto. Coverage around 90-95% I think

* -minor format changes

* - changes made according to feedbacks of markcampanelli

* - some cleaning on fit_sdm_desoto to make it more readable
- tests completed

* - minor code cleaning

* - check on importation of scipy removed

* - minor cleaning

* - attempt to reach 100% coverage

* - description added in docs/sphinx/source/whatsnew/v0.7.0.rst

* - changes made according to cwhanse review. Except alpha_sc and beta_voc still in %/K

* - minor correction and adaptation of test

* - sign correction on 3rd equation

* - changes on units of alpha_sc and beta_voc inputs. Now in A/K and V/K rather than %/K.
- 'celltype' input replaced by EgRef and dEgdT, with values of Si as default

* - other line of test added for better coverage

* - some changes to try feedbacks of cwhanse and markcampanelli, not finished

* -OptimizeResult added in output
- solver_method removed
- docstring modified
- result.message included in message raised by RuntimeError

* - includes all feedbacks made on the 21/10, except moving of pv_fct() to the module level

* - pv_fct moved out of the fit_sdm_desoto function and renamed in _system_of_equations

* - minor modification: Boltzmann k given in specs to avoid import of scipy in _system_of_equations

* - cleaning and minor modifications to docstring

* - references added to docstring in _system_of_equations

* - modification for removing last lint error

* - modification for removing last lint error

* - modification for removing lint error

* - integration of adriesse suggestions

* - adding of tylunel to the list of contributors

* - adding of mark requires_scipy

* change tools._scalar_out/_array_out arg name to avoid collision with builtin input function (#800)

* replace Pandas item() implementation with numpy's using .values

* force value to np.ndarray to enable universal use of .item()

* rename function parameters to arg instead of input

* convert last input to arg

* remove extra lines

* Implement IEC 61853 IAM calculations for diffuse irradiance (#793)

* Functional new function.

* Undo

* Working function; partial docstring; no tests.

* Complete docstring; simplify a bit; start tests.

* Getting better all the time!

* Adjust some comments and update whatsnew.

* Improvements as per review.

* More changes as requested.

* Fun with stickler.

* Docstring corrections.

* Remove all controversial code.

* add macOS 10.14 Mojave to Azure Pipelines CI (#812)

* update readme docs to stable

* Add note clearksky Ineichen term b (#814)

Thanks!

* Drop DataFrame as option for `module` input to pvsystem.sapm (#811)

* update docstring for pvsystem.sapm

* update whatsnew

* minor edit

* Don't expose model parameter dictionaries to users (#805)

* Don't expose model parameter dicts to users

* Update whatsnew

* Update whatsnew for private dictionaries

* * Fix for issue #782 (#816)

- Validity checks with new kwargs min_pw and max_pw
- Note added in docstring of epw.read_epw() concerning TMY3.epw files from energyplus.net
- corresponding tests

* Add shield organization table with download shields (#820)

* create shield table, insert coverage info

* flesh out table, include all previous badges, plus downloads

* clean up

* replace Azure Pipelines shield with pvlib's urls

* add conda-forge version and last updated

* fix reading MIDC files with mismatching header/data columns (#822)

* fix reading MIDC files with mismatching header/data columns

* Update pvlib/iotools/midc.py

Co-Authored-By: Will Holmgren <[email protected]>

* add timeout argument

* Change units on SAPM effective irradiance from suns to W/m2 (#815)

* change effective_irradiance units in sapm, change sapm_effective_irradiance function

* whatsnew

* docstring additions

* lint

* remove /1000 from ModelChain.sapm, fix pvsystem tests

* test fix, docstring edits

* change irrad_ref to suns, with deprecation message

* remove suns option, new kwargs

* remove reference_irradiance from tests

* fix method test

* update warning

* separate API breaking changes

* fix handful of documentation warnings (#819)

* fix handful of documentation warnings

* more fixes

* update contributing documentation, pr template (#818)

* update contributing documentation

* update pr template

* add links to installation [skip ci]

* move sentence, add link

* import bifacial module in __init__.py (#826)

* import bifacial module

* flake8

* Put SAM product renaming code in a separate function, simplify, add warning. (#753)

* Put product renaming code in a separate function, simplify, add warning.

* Add test and pacify stickler somewhat.

* Add one comment

* u -> o

* Create scaling.py and implement WVM model (#807)

* create initial files for scaling package

* Completed draft of WVM using only discrete distance algorithm.

* fixed lint errors, moved scipy imports and marked tests @requires_scipy

* added tests to complete coverage as far as I know how

* added lines to the sphinx documentation

* Changes following review. WVM accepts list of xy tuples as inputs. Cleanup of docs.

* Added my name to the list in the what's new docs. Hope this is the intent.

* rename internal variables, truncate comment for lint

* Removed section headings from api.rst to shorten

* Change to try catch block. Added additional test to match.

* convert tests to use fixtures

* fix lint errors

* one more lint error

* Fix typo in TMY total sky cover uncertainty column name (#831)

* Fix Typo in in Cloud cover uncertainty column name

* Fix alignment in docstring

* Location object creation from epw metadata (#821)

* * function location.Location.from_epw added. It permits to create a Location object from metadata of a epw file, typically retrieved with epw.read_epw().

* - Apply renaming suggestions with metadata and data

* Implement IEC 61853 module temperature model (#834)

* First functional function.

* Fixed defaults and docstring; added tests and sphinx entries.

* Small but important docstring changes.

* update to numpy-1.12.0 (#830)

* update to numpy-1.12.0

* update what's new to require numpy >= 1.12.0

* update both pandas and numpy requirements

paint the bikeshed white: https://imgs.xkcd.com/comics/timeline_of_bicycle_design.png

* Docstring formatting (#833)

* improve docstring sphinx rendering in ModelChain methods

* improve docstring sphinx rendering in Location.from_tmy

* improve docstring sphinx rendering in tracking.singleaxis

* improve docstring sphinx rendering in iotools.midc.read_midc

* improve docstring sphinx rendering in ivtools functions

* remove duplicate pvwatts_losses entry in api.rst

* improve docstring sphinx rendering in clearsky functions

* add whatsnew entry

* update formatting of "Assigns attributes" info in modelchain.py docstrings

* fix capitalization in iotools.midc.read_midc

* improve docstring formatting for return value dict keys in ivtools.fit_sdm_desoto

* simplify Location.from_tmy docstring

* pin sphinx version at 1.8.5 in setup.py

* make solarposition.py footnotes compatible with numpydoc

* make clearsky.py footnotes compatible with numpydoc

* make atmosphere.py footnotes compatible with numpydoc

* make irradiance.py footnotes compatible with numpydoc

* make iam.py footnotes compatible with numpydoc

* make temperature.py footnotes compatible with numpydoc

* make pvsystem.py footnotes compatible with numpydoc

* make singlediode.py footnotes compatible with numpydoc

* make ivtools.py footnotes compatible with numpydoc

* make tracking.py footnotes compatible with numpydoc

* make tmy.py footnotes compatible with numpydoc

* make epw.py footnotes compatible with numpydoc

* make srml.py footnotes compatible with numpydoc

* make surfrad.py footnotes compatible with numpydoc

* make psm3.py footnotes compatible with numpydoc

* fix typo in pvsyst_cell docstring footnote

* add sphinx suppress_warnings entry for unreferenced footnote warnings

* make the stickler happy

* fix improper reference format in pvsystem.calcparams_pvsyst docstring

* remove TMY module from api.rst and add pvsystem.adrinverter

* add link to github issue about suppressing sphinx warnings

Co-Authored-By: Will Holmgren <[email protected]>

* fix lint issue in pvsystem.py and make pvsystem.adrinverter docstring reference numpydoc compliant

* *� Formatting of ModelChain.diode_params in pandas.DataFrame (#832)

Changes ModelChain.diode_params from tuple to DataFrame

* update whatsnew.rst, add contributors, v0.7.0 release date, address #809 (#839)

* update whatsnew.rst, add contributors, v0.7.0 release date, address #809

* fix dotted line to match date line's length

* change overlooked irradiance.total_irrad references to irradiance.get_total_irradiance (#843)

* refactor get_psm3 code into parse_psm3, read_psm3 (#842)

* Split out PSM3 parsing code into its own function read_psm3; update docstring to include new PSM3 metadata fields

* update tests and code for read_psm3 and get_psm3

- get_psm3: add leap_day parameter
- get_psm3: change API endpoint for TMY requests
- get_psm3: update allowed names list to include 2018 datasets
- get_psm3: add single-year test
- get_psm3: change existing test to test TMY data only, remove invalid interval test
- read_psm3: fix warning of blank columns from excel CSVs
- read_psm3: change dtype of cloud type and fill flag to int
- read_psm3: add tests for reading filename and file-like object

* Create whatsnew for 0.7.1, add read_psm3 entry to api.rst

* add read_psm3 to iotools/__init__.py

* Clean up NSRDB PSM3 API urls

* Change read_psm3 to parse_psm3, remove ability to read files, add small example of reading files

* Add read_psm3 back in, refactor psm3 tests to reduce code duplication

* fix typo in parse_psm3 documentation

* Fix error in the irradiance unit checker. (#844)

* Add gallery of examples using sphinx-gallery (#846)

* draft of sunpath diagram example

* add explanation text and cartesian diagram

* Delete solarposition.rst

* Add sphinx-gallery and configuration

* add sunpath example

* add singleaxis tracker example

* linting fixes

* rename "Intro Examples" to "Intro Tutorial", move "Example Gallery" up in the toctree

* Update introtutorial.rst

* more linting fixes

* last linting fix, probably

* Mention example gallery in whatsnew

* update introtutorial link in package_overview.rst; move warnings import to top of conf.py

* add pvgis to iotools (#845)

* add get_pvgis_tmy to iotools

* stickler fixes maybe?

* add pvgis tests

- upload test data
- check the JSON output format
- add get_pvgis_tmy to api
- add timeout and url args as suggested
- add raises to explain error messages
- correct user horizon docstring decription

* fix stickler

* STY: fix indents and long lines in test meta JSON

* check each column of output separately

* add test for basic output format

* check error raised if bad output format
* must use NamedTemporaryFile with delete=True
* add test for csv output format

* add test for epw output format

* add parse_epw to use buffer v. filepath

* add test data for pvgis tmy userhorizon
* add parse_epw to iotools api
* add tests for usehorizon & userhorizon
* check meta from epw
* in test_epw, meta not used so replace with _
* pvgis-tmy-epw don't use temp files anymore

* fix use lowercase url in request

* test startyear and endyear
* test a bad url, returns 404

* use assert cond is False instead of ==, thx stickler

* update api docs and what's new

* fix typo in Parameters

* use with context manager for IO

* instead of try:finally
* fix defaults notation
* add reference links and improve docstring,

* use fixtures for test_pvgis

* also move META dict to a json file in pvlib/data

* fix reference links

* fix open for py35

* also add references header in pvgis.py

* remove whitespace on blank line

* initialize data to None in case API fails to respond to bad outputformat

* fix table line lengtsh, remove #noqa, use grid (#852)

* Improves sapm deprecation warning checker  (#854)

* improve warning condition

* fix indent

* overthinking this

* handle different types

* stickler

* fix test logic

* hush, stickler

* improve solpos tz requirements documentation (#853)

* improve solpos tz requirements

* Update pvlib/solarposition.py

Co-Authored-By: Cliff Hansen <[email protected]>

* Update pvlib/solarposition.py

Co-Authored-By: Cliff Hansen <[email protected]>

Co-authored-by: Cliff Hansen <[email protected]>

* DOC: remove superscript formatting for citation callouts (#855)

* remove superscript formatting for footnotes via css override

* update conf.py comments

* reorganize tests into subfolders and use pathlib for conftest DATA_DIR (#859)

* reorganize tests into subfolders

* closes #848
* create pvlib/tests/iotools and move all iotools tests into that
subfolder
* use fixtures in test_ecmwf_macc.py for expected_test_data
* use conftest.data_dir instead of redundant boilerplate process of
retrieving test folder for each tests, mostly for iotools
* fix test_ivtools.py was using full path to conftest but tests is not a
python package, so no
* change "pvlib/test" -> "pvlib/tests" to conform to popular convention

Signed-off-by: Mark Mikofski <[email protected]>

* in azure pipelines, pytest pvlib only, not pvlib/test

* Update v0.7.1.rst

* Update .codecov.yml

* Update docs/sphinx/source/whatsnew/v0.7.1.rst

with suggestion from Will

Co-Authored-By: Will Holmgren <[email protected]>

* use pathlib, remove os, inspect

* some iotools expect string to check for url using startswith() - so
  stringify path objects first
* in midc, use requests.get params arg to build querystring instead of
  doing it manually
* a couple of places change constants to ALL CAPS, some other places
  make a fixture, sometimes nothing; not consistent
* also in test_midc, comment out unused URL, was it part of a test once?

* path objects only work seamlessly on Python 3.6 or later

* stringify a few more path objects

* psm3.read_psm3 still expects a string filename
* tmy3 and tmy2 also could be strings

* last two: stringify epw and surfrad

* fingers crossed!

* fix typo, mention pathlib in what's new

Co-authored-by: Will Holmgren <[email protected]>
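
A short, hedged sketch of the conftest/pathlib pattern and the `requests` change described in the test-reorganization commits above. The `DATA_DIR` name mirrors the commits, but the exact folder layout and file names here are placeholders.

```python
import io
from pathlib import Path

import requests

import pvlib

# pathlib-based test data directory, as in the reorganized conftest; the
# exact layout here is an assumption
DATA_DIR = Path(__file__).parent / 'data'


def read_greensboro_tmy3():
    # some readers check filename.startswith('http'), so hand them a string
    fname = DATA_DIR / '723170TYA.CSV'  # hypothetical test file name
    return pvlib.iotools.read_tmy3(str(fname))


def fetch_csv(url, params):
    # let requests build the querystring from a dict instead of
    # concatenating it by hand (the midc change noted above)
    response = requests.get(url, params=params, timeout=30)
    response.raise_for_status()
    return io.StringIO(response.text)
```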

* add Boyle/Coello (Humboldt State Univ) soiling model (#850)

* initial_commit_Boyle_Coello_soiling_model

* stickler-ci_correction

* E128_Error

* E128_Error

* E128_Error

* format_corrections

* updated soiling_hsu

* updated soiling_hsu

* added unit test

* added unit test

* added unit test

* added unit test

* added unit test

* corrections_to_test_losses

* corrections_to_test_losses

* cleaning Test function

* cleaning Test function

* cleaning Test function

* update api.rst, whatsnew

* merge_correction

* merge_corrections

* updated_init

Co-authored-by: Cliff Hansen <[email protected]>
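
A minimal, hedged sketch of calling the HSU soiling model added in the commits above. The module location (`pvlib.losses` at this point in pvlib's history) and the argument names are inferred from the commit messages and should be treated as assumptions; the model also needs scipy installed.

```python
import pandas as pd

import pvlib

times = pd.date_range('2019-01-01', periods=24 * 7, freq='H', tz='Etc/GMT+7')
rainfall = pd.Series(0.0, index=times)
rainfall.iloc[72] = 5.0  # a single 5 mm rain event mid-week

soiling_ratio = pvlib.losses.soiling_hsu(
    rainfall=rainfall,
    cleaning_threshold=1.0,  # mm of rain needed to clean the array
    tilt=20,                 # degrees
    pm2_5=1.0e-5,            # particulate concentrations, g/m^3
    pm10=2.0e-5,
)
print(soiling_ratio.tail())
```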

* update whats new for #844 and other contributors (#851)

* update whats new for #844 and other contributors

* rebase

* epw docstring noqa fix

* update whatsnew date

* add anton. minor api.rst clean up

* escape asterisk

* add docs auto_examples to gitignore

* pep8

* add numfocus affiliation to sphinx docs (#862)

* add numfocus affiliation to sphinx docs

* fix link to use rst, not markdown

* no width on numfocus image

* Create v0.7.2.rst

* include what's new for v0.7.2

* fix typos in what's new for v0.7.2

* ghuser directive missing colon
* missing title and reference directive
* use pre-established heading conventions like tildes and single dashes

* use my full name

* fix tmy3 leapyear in February handling, coerce_year raises exception (#866)

* create test for GH865

* checks if there is a 2/29 in the parsed output
* checks if can use coerce year

* check it's really greensboro

* also check that year is actually coerced

* check if the TMY Feb contains a leap year

* then change 2/29 to 3/1
* add sanity check, indices should increase monotonically, by 1hr,
  except perhaps the last hour, which might be the previous year.
* add FIXME for coerce_year, should last hour be Jan1 of the next year?

* add data file, update what's new

* stickler space around operator in test

* move warning after notes

* oops, copy and paste error

* remove redundant Greensboro meta check

* use improved snippet from Kevin Anderson

* remove dateutil import
* remove _parsedate

* use 'h' unit string in pd.to_timedelta for min req pandas-0.18.1

* fix issue where first and last hours should be 1 AM and midnight

* elaborate on warning regarding definition of TMY and possibly
  unexpected behavior when coercing the year
* add more comments
* use mod instead of subtracting hour to change midnight from 24 to zero
* shift all midnight rows one day forward
* check for leap day (2/29) and shift to March 1st
* add copious comments and a note that if pandas is updated to 0.24
  then we can use the `array` property to detect leap day
* add test to confirm that TMY3 first hour is 1 AM and last hour is midnight

* remove unneeded numpy import in iotools/tmy

* update what's new to explain that tmy3 dataframe now has original Date
  and Time columns used to parse the indices

* more explicit test for 2/29/1996 in tmy3

* wait to set index to avoid error in py35-min

* ValueError: cannot reindex from a duplicate axis

* try pandas datetime instead of index

* no need to convert data_ymd to datetime

* it ALREADY IS! ha ha ha
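
A hedged sketch of the behavior targeted by these commits (and refined further in #910 below): read a TMY3 file with a coerced year, confirm no Feb 29 survives parsing, and check the hourly stepping of the index. The file name is a placeholder.

```python
import pandas as pd

import pvlib

# any TMY3 file whose February was drawn from a leap year will exercise this
data, meta = pvlib.iotools.read_tmy3('greensboro_723170TYA.CSV', coerce_year=1990)

# no Feb 29 should remain after parsing
assert not ((data.index.month == 2) & (data.index.day == 29)).any()

# timestamps should step monotonically by exactly one hour
assert (data.index.to_series().diff().dropna() == pd.Timedelta('1h')).all()
```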

* Update for PySAM 2.0 (#874)

* 'none' to None

* require pysam 2.0 or later

* docstring edits, move some functions to private

* make test_psm3.py robust to API overuse errors (#873)

* use NREL_API_KEY environment variable when available

* remove try-except clause to ensure that azure env variable is used

* set environment variable from Azure's secret variables

* Update azure-pipelines.yml for Azure Pipelines

* Update azure-pipelines.yml for Azure Pipelines within pytest call

* Update azure-pipelines.yml for Azure Pipelines echo only

* Update azure-pipelines.yml for Azure Pipelines with python

* Update azure-pipelines.yml for Azure Pipelines all os.environ

* Update azure-pipelines.yml for Azure Pipelines export variable

* export Azure secret variable into environment to access from python

* Update azure-pipelines.yml for Azure Pipelines fix displayName

* Update azure-pipelines.yml for Azure Pipelines psm3 test only

* Update azure-pipelines.yml for Azure Pipelines for real this time

* Update azure-pipelines.yml for Azure Pipelines for really real this time

* Update azure-pipelines.yml for Azure Pipelines single failing test

* Update azure-pipelines.yml for Azure Pipelines test_get_psm3_tmy

* Update azure-pipelines.yml for Azure Pipelines io_input

* Update azure-pipelines.yml for Azure Pipelines test_parse_psm3

* Update azure-pipelines.yml for Azure Pipelines test_read_psm3

* Update azure-pipelines.yml for Azure Pipelines all test_psm3.py tests

* Update azure-pipelines.yml for Azure Pipelines all python versions

* use NREL_API_KEY environment variable when available

* remove try-except clause to ensure that azure env variable is used

* set environment variable from Azure's secret variables

* Update azure-pipelines.yml for Azure Pipelines

* Update azure-pipelines.yml for Azure Pipelines within pytest call

* Update azure-pipelines.yml for Azure Pipelines echo only

* Update azure-pipelines.yml for Azure Pipelines with python

* Update azure-pipelines.yml for Azure Pipelines all os.environ

* Update azure-pipelines.yml for Azure Pipelines export variable

* export Azure secret variable into environment to access from python

* temporary test_blah function and get_psm3 delayer

* improve delay_get_psm3(), replace get_psm3() calls

* set DEMO_KEY to environment variable if available

* update azure-pipelines basic script for testing

* match azure secret variable

* return to full suite of testing on azure

* return test_psm3() specific pytest call to general

* add description to whatsnew

* formatting fix

* logger module implementation

* more formatting

* change while loop to for loop

* use warnings instead of logging

* fixturize DEMO_KEY

* implement use of pytest-rerunfailures instead of custom loop

* parametrize error tests into a single test

* fix windows environment import of NREL key

* test to echo environment variable of NREL key

* fix windows secret variable setting

* add pytest-rerunfailures to ci requirements yml

* use secret.nrelApiKey syntax

* test new secret variable

* try secret.api_key syntax

* std syntax only psm3

* try to print other variables that aren't secrets

* confirm personal repo still works

* bump

* try all python versions

* try everything

* install pytest-rerunfailures in Azure

* configure conda_35 tests to pull pytest_rerunfailures from pip instead of conda

* add note for pytest-rerunfailures for conda_35

* change fixture from DEMO_KEY to nrel_api_key

* remove debug variable echo commands

* move pytest-rerunfailures to pip install

* improve docstrings

* use verbose name for caught error

* readd leap year test
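
A hedged sketch of the test hardening described above: pull the NREL API key from the environment (falling back to DEMO_KEY) and let pytest-rerunfailures retry flaky network tests. The fixture name follows the commits, but the test body and the `(headers, data)` unpacking are assumptions based on the 0.7-era docstring, not the exact suite contents.

```python
import os

import pytest

import pvlib


@pytest.fixture(scope='module')
def nrel_api_key():
    # Azure injects the secret as an environment variable; locally, DEMO_KEY
    # works but is heavily rate-limited
    return os.environ.get('NREL_API_KEY', 'DEMO_KEY')


@pytest.mark.flaky(reruns=5, reruns_delay=10)  # pytest-rerunfailures
def test_get_psm3(nrel_api_key):
    # return unpacking assumed to be (headers, data) for this version
    headers, data = pvlib.iotools.get_psm3(
        40.0, -105.0, nrel_api_key, '[email protected]')
    assert not data.empty
```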

* Fix backwards path sep (#876)

Fixes #875

* forecast compat with pandas 1.0, fix bug in Location tz handling (#879)

* fix location tz bug with datetime.timezone.utc

* update whatsnew

* issue in whatsnew

* fix whatsnew class typo

* remove needs_pandas decorator (#885)

* remove definition and applications of pandas_0_22 decorator

* remove needs_pandas_0_17

* Fix backwards path sep (#876)

Fixes #875

* readd needs_pandas_0_22 for test_irradiance.py in travis py35-min

* forecast compat with pandas 1.0, fix bug in Location tz handling (#879)

* fix location tz bug with datetime.timezone.utc

* update whatsnew

* issue in whatsnew

* fix whatsnew class typo

* remove definition and applications of pandas_0_22 decorator

* remove needs_pandas_0_17

* readd needs_pandas_0_22 for test_irradiance.py in travis py35-min

Co-authored-by: Mark Mikofski <[email protected]>
Co-authored-by: Will Holmgren <[email protected]>
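
A hedged illustration of the Location tz fix merged above (#879): a stdlib `datetime.timezone` object should now be accepted alongside strings and pytz zones. The normalization to `'UTC'` is the expected behavior per the fix, stated here as an assumption.

```python
import datetime

import pvlib

# previously this raised; after the fix, stdlib timezone objects are accepted
loc = pvlib.location.Location(32.2, -110.9, tz=datetime.timezone.utc)
print(loc.tz, loc.pytz)  # expected: normalized to 'UTC' and pytz UTC internally
```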

* rewrite, reduce helpers, remove while loop

* Add calcparams_desoto+singlediode example to gallery (#872)

* Create plot_singlediode.py

* Apply suggestions from code review

Co-Authored-By: Cliff Hansen <[email protected]>

* RST cleanup, reorder legend entries

* Formatting fixes

* whatsnew

* whatsnewer

* sorry, one formatting fix

* stickler

Co-authored-by: Cliff Hansen <[email protected]>
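
A short, hedged sketch of the gallery example's core calls: compute the five single-diode parameters with `calcparams_desoto`, then solve the diode equation with `singlediode`. The module parameter values below are illustrative, not a real module's datasheet.

```python
import pvlib

# De Soto reference parameters (illustrative values only)
params = dict(alpha_sc=0.004, a_ref=1.6, I_L_ref=6.0, I_o_ref=3e-10,
              R_sh_ref=300.0, R_s=0.4)

IL, I0, Rs, Rsh, nNsVth = pvlib.pvsystem.calcparams_desoto(
    effective_irradiance=800, temp_cell=45, **params)

curve_info = pvlib.pvsystem.singlediode(
    photocurrent=IL, saturation_current=I0, resistance_series=Rs,
    resistance_shunt=Rsh, nNsVth=nNsVth, ivcurve_pnts=100)

print(curve_info['p_mp'])  # maximum power point
```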

* fix documentation home page title (#890)

* fix documentation home page title

* fix line length

* add to what's new

* rename functions, review responses, add subhourly test

* fixes

* temperature -> temp_air, test correction, fix default for m

* put m=-80 back

* add Kimber soiling model (#860)

* add Kimber soiling model

* remove spaces, thx stickler

* typos: aa = a, rest = reset

* move soiling to losses

* add test for kimber soiling model

* also add a gallery example draft

* add kimber to api.rst

* use pd.to_datetime for plt
* start to add manwash test using datetime

* add test for manual wash

* add test data for manual wash
* try to fix example plot using to_pydatetime

* please fix kimber gallery example

* Fix indent on references

* add test for no rain, max soil

* reuse rainfall events
* combine zero soil assignments

* fix typos in Kimber soiling model

* correct Adrianne spelling
* add table of soiling rates from figure 3 from Kimber paper
* fix reference indents in docstring

* update what's new w/ kimber model

* fix malformed table in losses
* fix long line in docstring

* Apply suggestions from code review

* change `rainfall_timeseries` input arg to just `rainfall`
* clarification of rainfall input, put units in brackets at end of definition
* change `threshold` input arg to `cleaning_threshold`
* change `soiling_rate` input arg to `soiling_loss_rate` and clarify definition as energy loss putting units [unitless] in brackets at the end of the definition
* remove suggestion regarding grace period to the Notes section
* fix formatting of return, there is no name, only return type
* improve notes section to add that grace period may also vary by region and to use energy loss rate instead of soiling since it's not a mass rate

Co-Authored-By: Cliff Hansen <[email protected]>

* update rainfall and threshold args in kimber

* to match docstring improvement suggested in code review
* wrap long lines
* be more consistent with HSU docstring

* change soiling_rate to soiling_loss_rate

* combine panels cleaned today with grace-period,
 since it includes today

* add a 1-liner for `soiling_kimber` in the sphinx docs

* also fix missing "for" in note

Co-Authored-By: Kevin Anderson <[email protected]>

* separate tests for Kimber

* remove extra whitespace

* delete trailing whitespace in kimber test

* vectorize kimber soiling

* use a rolling window and rain_accum_period to get more accurate rain coverage
  overnight, instead of only discrete days from midnight to midnight
* add rain_accum_period to allow deviation from the Kimber model, which recommends
  only 24 h, but why not be flexible?
* add istmy=False to fix last hour of tmy to be monotonically increasing
* add another test for initial condition
* fix example input arg cleaning_threshold

* fix trailing whitespace in kimber

* Kimber needs pandas-0.22

Co-authored-by: Cliff Hansen <[email protected]>
Co-authored-by: Kevin Anderson <[email protected]>
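
A hedged sketch of calling the Kimber soiling model from this PR. The module location (`pvlib.losses` at this point) and the keyword names (`cleaning_threshold`, `soiling_loss_rate`, `grace_period`) are taken from the commit messages above; the numeric values shown are assumptions for illustration.

```python
import pandas as pd

import pvlib

times = pd.date_range('2019-01-01', periods=24 * 60, freq='H', tz='Etc/GMT+8')
rainfall = pd.Series(0.0, index=times)
# a half-day storm roughly a month in, enough to exceed the cleaning threshold
rainfall.loc['2019-02-01 06:00':'2019-02-01 18:00'] = 1.0

soiling = pvlib.losses.soiling_kimber(
    rainfall,
    cleaning_threshold=6,      # mm of rain, accumulated over rain_accum_period
    soiling_loss_rate=0.0015,  # fractional energy loss per dry day
    grace_period=14)           # days after cleaning with no soiling accrual

print(soiling.max())  # largest fraction of energy lost, just before the storm
```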

* move contents of 0.6.4 whatsnew into 0.7.0 whatsnew (#899)

* fix indentation

* add notes to 0.7.0 and 0.7.2 files

* remove unreleased 0.6.4.rst

* compatibility for cftime==1.1 (#900)

* compat for cftime 1.1

* unused import

* whats new

* Use pytest remotedata (#896)

* enable pytest_remotedata for psm3

* add pytest_remotedata to test_pvgis.py

* add @pytest.mark.remote_data to other network dependent tests

* use up-to-date numpy, pandas

* install netcdf4 from pip instead of conda

* install pytest-remotedata from pip for conda 3.5

* readd @network to test_midc.py

* add pytest-remotedata to py35-min for travis

* add to what's new

* point coverage at run with --remote-data

* add --remote-data to travis run for coveralls to pickup

* Revert "point coverage at run with --remote-data"

This reverts commit e73eeb02db17f930c1958c219d2bdf38b557c726.

* Update docs/sphinx/source/whatsnew/v0.7.2.rst

Co-Authored-By: Will Holmgren <[email protected]>

* Update docs/sphinx/source/whatsnew/v0.7.2.rst

Co-Authored-By: Will Holmgren <[email protected]>

* Revert "install netcdf4 from pip instead of conda"

This reverts commit 6fa8ecfa89a829c9a81fec566b2f5e97b9ecda38.

* fix linelength in what's new

* add pytest-remotedata to Azure MacOS

* move --remote-data tests to conda_linux

* add remote_data decorators to tests requiring network

* only install pytest-remotedata from pip, explain

* fix tests directory name and url

* document use of --remote-data flag in Contributing>Testing

Co-authored-by: Will Holmgren <[email protected]>
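
A hedged sketch of how the `remote_data` marker is used after this PR: decorate tests that hit external services, then opt in with the `--remote-data` flag; without it, pytest-remotedata skips them. The test body is illustrative, not the suite's exact test.

```python
import pytest

import pvlib

# run the network-dependent tests explicitly with:
#     pytest pvlib --remote-data
# without the flag, pytest-remotedata skips anything carrying this marker


@pytest.mark.remote_data
def test_get_pvgis_tmy_remote():
    data, months, inputs, meta = pvlib.iotools.get_pvgis_tmy(45.0, 8.0)
    assert not data.empty
```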

* Include Python3.8 into Azure Pipelines (#904)

* add Python3.8 to azure config

* fix version name

* add requirements-py38.yml config

* add to what's new

* fix what's new pytest-remote issue and pull reference

* eliminate some of the test suite warnings (#906)

* handle sam_data fixture duplicate entries warnings

* pep8

* force numpy within detect_clearsky to avoid pandas indexing deprecation

* martin_ruiz_diffuse runtimewarnings

* losses dtype, pd.datetime

* whats new

* Add Contributing section about gallery examples (#905)

* add section about gallery examples to contributing guide

* whatsnew

* whatsnew cleanup

* add review suggestion

* Expose temperature.faiman in PVSystem and ModelChain  (#897)

* remove unused pvsyst_celltemp parameter docs

* fix up temperature.faiman docstring formatting

* add pvsystem.PVSystem.faiman_celltemp

* add faiman model to ModelChain

* fix copy/paste error

* add test

* whatsnew

* fix whatsnew

* add GH links to whatsnew entry

* split up MC cell temp tests; add faiman test

* add faiman to MC infer_temp_model test; change test to check inferred model

* add MC.faiman_temp to api.rst

* delete duplicate pvsystem test
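
A minimal, hedged sketch of the Faiman cell temperature model now reachable in three ways; the function-level call is the core, and the PVSystem/ModelChain hooks follow the commits above.

```python
import pvlib

# direct call: plane-of-array irradiance [W/m^2], air temperature [C],
# wind speed [m/s]; the heat-transfer defaults (u0, u1) are left at the
# library's values
t_cell = pvlib.temperature.faiman(poa_global=800, temp_air=20, wind_speed=5)
print(t_cell)

# the same model can now be selected by name in ModelChain (per this PR),
# e.g. ModelChain(system, location, temperature_model='faiman')
```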

* Rename test_modelchain system fixture (#915)

* Rename `system` fixture

* Update whatsnew

* Fix lint errors

* BLD: build docs on Azure Pipelines (#909)

* try building docs on Azure Pipelines

* add doc.yml conda env file

* specify conda env name

* fix doc -> docs

* include sphinx-gallery

* add pytest to env

* install library to run sphinx-gallery

* install full suite of pvlib dependencies

* remove 'warnings as errors' flag

* return standard tests to azure-pipelines.yml

* fix job spacing

* add to what's new

* remove unused doc.yml config

* use requirements-py38.yml for docbuilding

* remove conda environment

* more conda removal

* specify python 3.8

* additional cleanup

* test docs before publish, rename env build

* fix read_tmy3 with year coerced not monotonic, breaks soiling (#910)

* fix soiling_hsu for tmy not monotonic

* if the TMY year is coerced, then the last hour is in the same year,
  before the other timestamps, so the timestamps are not monotonic
* closes #889
* add underscore to is_tmy
* add is_tmy to soiling_hsu
* add private _fix_tmy_monotonicity
* also fix the reference footnote in the first line of the docstring,
  since it is used in api.rst and causes a doc warning
* don't use mutable input args
* don't create a pd.Timedelta object every single time soiling_hsu() is called
* always create the soiling pd.Series for consistency and robustness to future edits
* fix: rain_index is not a tz-aware pd.DatetimeIndex series,
  so rename it rain_index_vals and use rainfall.index in Soiling
* add new test for gh889, check soiling ratio for 24 hours,
 and check timestamp of first and last hours, and last day

* update what's new with fix for TMY3 in soiling_hsu

* also change the first line of the kimber soiling docstring to say it calculates
  energy loss, not soiling ratio, since kimber is 1 - hsu;
  they're opposites!

* move fix_tmy_monotonicity to tmy in iotools

* and remove it from losses, remove is_tmy args, and remove calls to
 fix tmy monotonicity
* test to make sure if coerced year, last record is Jan 1 of next year
* add fix_tmy3_coerce_year_monotonicity to iotools api
* revert changes to soiling_hsu, don't use private constants for
 rain accumulation period or deposition velocity
* update kimber soiling example in gallery
  - to use fix_tmy3_coerce_year_monotonicity
  - and not use conftest, instead build path
  - also better comments
* remove test for gh889 in test_losses
* apply fix to greensboro rain data fixture and remove is_tmy args from kimber tests
* add needs_pandas_0_22 to soiling_hsu tests to show that it is required,
  since requires_scipy masks the requirement

* fix path to data in kimber example

* add test for HSU soiling defaults for coverage
* fix iotools api stickler warning
* remove unused pytz import in test_losses

* update what's new and api.rst with fix_tmy3_coerce_year_monotonicity

* parameters spelling typo in fix_tmy3_monotonicity docstring

* also move reference in kimber soiling example to comment in first code cell

* add monotonic_index kwarg to read_tmy3

* change name to tmy3_monotonic_index
* set monotonic_index default to False but warn that
  this will change to True in v0.8
* improve tmy3_monotonic_index docstring per reviewer comments: add a
  note to call it only after read_tmy3 with a coerced year,
  that monotonic_index=True combines the two calls, and that there's no validation
  that the input TMY3 has been coerced or already fixed
* also if only used for TMY3 no need to calculate timestep interval,
 it's always 1-hour
* that's all folks

* update what's new with monotonic_index kwarg

* Update pvlib/tests/iotools/test_tmy.py

* change test name to `test_tmy3_monotonic_index`

Co-Authored-By: Cliff Hansen <[email protected]>

* fix bug in read_tmy3 when year coerced

- so the indices are monotonically increasing
- add note to API changes and update bug fix in what's new
- update docstring in read_tmy3
  * explain in the coerce_year parameter that all of the indices except the
  last one will have this value, but the last index will be the value + 1
- update the warnings to explain that if the year is coerced, the last index
  will be in the next year
- coerce the year on date_ymd, which is a Series of Timestamps rather than an
  index and therefore mutable (can assign by integer index), which makes it
  trivial to change the last timestamp to the next year
- revert most of the changes from the previous commits in this PR
  * remove `tmy3_monotonic_index` from `api.rst`, `iotools/__init__.py`, and from `tmy.py`
  * remove `monotonic_index` kwarg from read_tmy3
  * remove use of `tmy3_monotonic_index` from sphinx gallery example,
   test_losses, and test_tmy
  * remove test for tmy3_monotonic_index and the monotonic_index kwarg
  * fix test_tmy test to treat last index differently, except for the
  leap year test which now has a constant diff, yay!

* stickler: rm unused imports from tmy, test

* suggested changes in read_tmy3 for monotonic index

* sed s/data/index/g
* wordsmithing docstring: coerce_year kwarg
* wordsmithing what's new bug fixes
* also consistently use periods for bullets in what's new

* Apply suggestions from code review to test last hour in read_tmy3

* and minor wordsmithing in what's new

Co-Authored-By: Will Holmgren <[email protected]>

Co-authored-by: Cliff Hansen <[email protected]>
Co-authored-by: Will Holmgren <[email protected]>

* add read pvgis tmy (#907)

* add read_pvgis_tmy

* for files downloaded from pvgis tool
* update tests, api, and docs
* allow commit .csv to pvlib/data

* update what's new with read_pvgis_tmy

* closes #880
* test import from api vs. full namespace path

* py35 does not support f-strings in read_pvgis_tmy

* don't use is for string comparison, ditto
* state returns, add see also

* use str in json.load in read_pvgis

* test read_pvgis_tmy_json, don't use binary buffer

* try parse_epw in read_pvgis_tmy first

* since attribute error only occurs for str/path

* more comments, space between pvgis outputformats

* raise ValueError if unknown output format
* test if ValueError raised for bad output format

* trailing new line in pvgis.py

* use `pvgis_format` instead of confusing arg

- infer parser format from file extension for csv, epw, and json
- much more detailed docstring for `pvgis_format`
- update tests for inferring format from extensions as well as failing
 if basic output format or file buffer isn't explicit

* Apply pvgis suggestions from code review to docstring

thanks Cliff!

Co-Authored-By: Cliff Hansen <[email protected]>

* wrap long lines in read pvgis docstring

* wordsmithing and formatting, hope it's okay
* don't need to index filename to get pvgis_format,
  just use rsplit('.', 1)[-1] to get the file extension

* use pathlib instead of rsplit in read_pvgis

- TypeError raised if filename is buffer, not ValueError

* python 3.5 has a different message for pathlib

- py>3.5: expected str, bytes or os.PathLike object
- py=3.5: argument should be a path or str object

Co-authored-by: Cliff Hansen <[email protected]>
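
A hedged sketch of the new reader added above: the parser format is inferred from the file extension (.json, .csv, or .epw), or passed explicitly via `pvgis_format` when reading from a buffer. File names are placeholders, and the 4-tuple return mirrors `get_pvgis_tmy`.

```python
import pvlib

# extension-based inference
data, months_selected, inputs, meta = pvlib.iotools.read_pvgis_tmy(
    'tmy_45.000_8.000.json')

# buffers have no extension, so pvgis_format must be given explicitly
with open('tmy_45.000_8.000.json') as buf:
    data, months_selected, inputs, meta = pvlib.iotools.read_pvgis_tmy(
        buf, pvgis_format='json')
```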

* TST: make iotools tests robust to API downtime (#919)

* add pytest.mark.flaky() to remote_data()

* move CI in what's new to Testing section

* add to what's new

* use RERUNS, RERUNS_DELAY variables to set flaky calls

* add imports to test_pvgis.py, punctuation

* Link to code of conduct (#922)

* Link to code of contact

* Update docs/sphinx/source/contributing.rst

Co-Authored-By: Cliff Hansen <[email protected]>

* remove CoC contact proj-team and shell prompts

* update what's new re: CoC link in contributing

Co-authored-by: Cliff Hansen <[email protected]>

* edits for threshold, use first data point

* Fix most sphinx warnings (#912)

* fix underline in examples/plot_singlediode.py

* move references out of the first sentence

* pin docutils at 0.15.2 in doc requirements

* Fancy "view on github" links in documentation (#913)

* try out fancy GH linking

* comment breadcrumbs.html

* fix syntax

* refactor GH link logic out of the template and into conf.py

* typo
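
A generic, hedged sketch of the kind of conf.py logic these commits describe: build a "view on GitHub" URL in Python and hand it to the page template through `html_context`. The names below (`make_github_url`, the path layout) are illustrative and not pvlib's actual implementation.

```python
# in docs/sphinx/source/conf.py (illustrative only)


def make_github_url(pagename):
    """Return a GitHub URL for the source behind a documentation page."""
    base = 'https://github.com/pvlib/pvlib-python/blob/master/'
    if pagename.startswith('generated/'):
        # API pages are generated from the library source, not .rst files
        return base + 'pvlib/'
    return base + 'docs/sphinx/source/' + pagename + '.rst'


html_context = {
    'make_github_url': make_github_url,  # called from the HTML template
}
```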

* add initial snow, better logic for coverage events

* lint, text fix

* TST: Use templates in Azure Pipelines config file (#926)

* try posix.yml template for bare linux and mac

* move conda tests to templates

* add to what's new

* remove commented flake check

* workaround for py35/pandas 0.23.4

* move line

* improve comments

* improvements from review

* improve docstring

* reorder comparison, add missing inplace

* correct test, cut/paste error

* vectorize

* correct .clip

* function and file renaming

* docstring corrections, replace hack with pandas offset

* update api.rst, whatsnew

* fix headings

* change to use pandas to_offset method

* Update docs/sphinx/source/api.rst

Co-Authored-By: Kevin Anderson <[email protected]>

* Update docs/sphinx/source/whatsnew/v0.7.2.rst

Co-Authored-By: Kevin Anderson <[email protected]>

* Update pvlib/snow.py

Co-Authored-By: Kevin Anderson <[email protected]>

* review comments

* add snow to __init__.py

* add test for irregular times

* correct user name
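
A hedged sketch of the snow-loss workflow these commits build up: estimate time-varying snow coverage from snowfall, irradiance, and air temperature, then convert coverage to a DC loss fraction. The function names (`coverage_nrel`, `dc_loss_nrel`) and argument names are assumptions based on the commit messages, not confirmed signatures.

```python
import pandas as pd

import pvlib

times = pd.date_range('2019-01-10', periods=48, freq='H', tz='Etc/GMT+7')
snowfall = pd.Series(0.0, index=times)
snowfall.iloc[:6] = 2.0  # cm of fresh snow in the first six hours
poa_irradiance = pd.Series(400.0, index=times)
temp_air = pd.Series(-1.0, index=times)

coverage = pvlib.snow.coverage_nrel(
    snowfall, poa_irradiance, temp_air, surface_tilt=30, initial_coverage=0)

# fraction of DC power lost for an array with 8 cell rows up the slant height
dc_loss = pvlib.snow.dc_loss_nrel(coverage, num_strings=8)
print(dc_loss.max())
```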

Co-authored-by: Cliff Hansen <[email protected]>
Co-authored-by: Alexander Morgan <[email protected]>
Co-authored-by: Anton Driesse <[email protected]>
Co-authored-by: Miguel Sánchez de León Peque <[email protected]>
Co-authored-by: Will Holmgren <[email protected]>
Co-authored-by: Adam Peretti <[email protected]>
Co-authored-by: Veronica Guo <[email protected]>
Co-authored-by: Cameron Stark <[email protected]>
Co-authored-by: tylunel <[email protected]>
Co-authored-by: Cedric Leroy <[email protected]>
Co-authored-by: Tony Lorenzo <[email protected]>
Co-authored-by: Joe Ranalli <[email protected]>
Co-authored-by: Hamilton Kibbe <[email protected]>
Co-authored-by: Mark Mikofski <[email protected]>
Co-authored-by: Kevin Anderson <[email protected]>
Co-authored-by: Kevin Anderson <[email protected]>
Co-authored-by: dzimmanck <[email protected]>
Co-authored-by: Valliappan_CA <[email protected]>