
Implement incomplete beta function #2678


Closed · wants to merge 1 commit

Conversation

domenzain
Contributor

@domenzain domenzain commented Oct 30, 2017

This implementation is based on the Cephes library's implementation by Steve Moshier.

This completes the work started in #2048 and continued in #2073. With it, all continuous distributions have working implementations of log CDFs.

These can then be used to more adequately implement censored distributions, as described in #1867 and #1864.
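The log CDF and log survival functions are what make censored likelihoods numerically tractable: censored observations contribute log tail probabilities rather than log densities. A minimal sketch of a right-censored normal log-likelihood using SciPy (illustration only; the function name `censored_normal_loglike` is hypothetical, not the PR's API):

```python
import numpy as np
from scipy import stats

def censored_normal_loglike(y, censored, mu, sigma):
    """Right-censored normal log-likelihood (illustrative sketch).

    Observed points contribute the log density; censored points
    contribute log P(Y > y), which is where numerically stable
    log CDF / log survival implementations matter.
    """
    y = np.asarray(y, dtype=float)
    censored = np.asarray(censored, dtype=bool)
    ll = np.where(
        censored,
        stats.norm.logsf(y, loc=mu, scale=sigma),   # log P(Y > y)
        stats.norm.logpdf(y, loc=mu, scale=sigma),  # log density
    )
    return ll.sum()

# toy usage: two observed values and one value censored at 2.0
print(censored_normal_loglike([0.5, -0.2, 2.0], [False, False, True], 0.0, 1.0))
```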

@domenzain
Contributor Author

For some context: the implementation of all log CDF methods is in the cdf_methods branch and can still be reviewed before merging into master.
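As an aside on why dedicated log CDF methods matter: the naive `log(cdf(x))` underflows deep in a distribution's tail, while a direct log CDF evaluation stays finite. An illustration with SciPy (not the PR's code):

```python
import numpy as np
from scipy import stats

x = -40.0  # deep in the lower tail of the standard normal

# cdf(-40) is ~7e-350, below the smallest double, so it underflows to 0.0
with np.errstate(divide="ignore"):
    naive = np.log(stats.norm.cdf(x))   # -> -inf

stable = stats.norm.logcdf(x)           # dedicated log CDF stays finite

print(naive, stable)
```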

def incomplete_beta_cfe(a, b, x, small):
    '''Incomplete beta continued fraction expansions

    Evaluates the continued fraction form of the incomplete Beta function.
    Derived from implementation by Ali Shoaib (https://goo.gl/HxjIJx).
    '''
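For reference, the continued-fraction approach can be sketched in plain Python with the modified Lentz algorithm (the Numerical Recipes `betacf` formulation, which is closely related to the Cephes code this PR ports; `betacf` and `incbet` are illustrative names, not the PR's symbolic implementation):

```python
import math

def betacf(a, b, x, max_iter=200, eps=1e-15):
    """Continued fraction for the regularized incomplete beta function,
    evaluated with the modified Lentz algorithm. Converges rapidly for
    x < (a + 1) / (a + b + 2)."""
    tiny = 1e-30
    qab, qap, qam = a + b, a + 1.0, a - 1.0
    c = 1.0
    d = 1.0 - qab * x / qap
    if abs(d) < tiny:
        d = tiny
    d = 1.0 / d
    h = d
    for m in range(1, max_iter + 1):
        m2 = 2 * m
        # even step of the recurrence
        aa = m * (b - m) * x / ((qam + m2) * (a + m2))
        d = 1.0 + aa * d
        if abs(d) < tiny:
            d = tiny
        c = 1.0 + aa / c
        if abs(c) < tiny:
            c = tiny
        d = 1.0 / d
        h *= d * c
        # odd step of the recurrence
        aa = -(a + m) * (qab + m) * x / ((a + m2) * (qap + m2))
        d = 1.0 + aa * d
        if abs(d) < tiny:
            d = tiny
        c = 1.0 + aa / c
        if abs(c) < tiny:
            c = tiny
        d = 1.0 / d
        delta = d * c
        h *= delta
        if abs(delta - 1.0) < eps:  # converged to double precision
            break
    return h

def incbet(a, b, x):
    """Regularized incomplete beta I_x(a, b) via the continued fraction."""
    if x <= 0.0:
        return 0.0
    if x >= 1.0:
        return 1.0
    # prefactor x^a (1-x)^b / B(a, b), computed in log space
    front = math.exp(
        math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
        + a * math.log(x) + b * math.log(1.0 - x)
    )
    if x < (a + 1.0) / (a + b + 2.0):
        return front * betacf(a, b, x) / a
    # symmetry I_x(a, b) = 1 - I_{1-x}(b, a) for better convergence
    return 1.0 - front * betacf(b, a, 1.0 - x) / b
```

The `max_iter` cap matters only as a safety stop: once `delta` is within machine epsilon of 1, further terms cannot change the double-precision result, which is the point made in the review discussion below.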
Member

Does max iter matter?

Member

Maybe it is better to implement a cfunc with a custom Op, since scan is slow. Is it possible to have a closed-form gradient?

Contributor Author

It only matters when there are too few steps for the continued fraction expansion to be close enough for the given float precision. Allowing for more steps is not useful.

Sorry, I'm not sure I understand what you mean by the gradient here.

In any case, the main issue with the previous (partial) implementation was that its numerical convergence was poor.
This implementation converges much better while being just as slow.

Of course it would be much faster to not do this symbolically...
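On the gradient question: the derivative of the regularized incomplete beta with respect to x does have a closed form (the Beta density), which is what a custom Op's grad could return for the x argument; derivatives with respect to a and b have no comparable elementary form. A quick check against SciPy (the function name `incbet_grad_x` is illustrative):

```python
import math
from scipy import special

def incbet_grad_x(a, b, x):
    """d/dx I_x(a, b) = x^(a-1) (1-x)^(b-1) / B(a, b):
    the Beta(a, b) density, computed in log space for stability."""
    log_beta = math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)
    return math.exp((a - 1) * math.log(x) + (b - 1) * math.log(1 - x) - log_beta)

# compare with a central finite difference of scipy's betainc
a, b, x, h = 2.5, 1.5, 0.3, 1e-6
fd = (special.betainc(a, b, x + h) - special.betainc(a, b, x - h)) / (2 * h)
print(incbet_grad_x(a, b, x), fd)  # the two values should agree closely
```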

@twiecki
Member

twiecki commented Mar 20, 2018

@domenzain Can you add this to the release-notes? Would be great to get this merged.

@domenzain
Contributor Author

Sorry, I should not have opened so many pull requests for these.
#2688 includes this same implementation of the incomplete beta function.

Point me to where you'd like an explanation and I'll write a synthesis.

@fonnesbeck
Member

Closing in favor of #2688

@fonnesbeck fonnesbeck closed this Sep 23, 2018