400 error on get_block() call #292


Open
julrach opened this issue Mar 4, 2021 · 13 comments

Comments

@julrach

julrach commented Mar 4, 2021

When I make a client.get_block(full_url) call, I get the error below:

Got 400 error attempting to POST to loadPageChunk, with data: {
    "pageId": "***",
    "limit": 100000,
    "cursor": {
        "stack": []
    },
    "chunkNumber": 0,
    "verticalColumns": false
 }

[ERROR] HTTPError: Invalid input.
Traceback (most recent call last):  
  File "/var/task/get_kitchens_function.py", line 15, in lambda_handler    
    page = client.get_block(full_url)  
  File "/opt/python/lib/python3.7/site-packages/notion/client.py", line 169, in get_block    
    block = self.get_record_data("block", block_id, force_refresh=force_refresh)  
  File "/opt/python/lib/python3.7/site-packages/notion/client.py", line 162, in get_record_data    
    return self._store.get(table, id, force_refresh=force_refresh)  
  File "/opt/python/lib/python3.7/site-packages/notion/store.py", line 184, in get    
    self.call_load_page_chunk(id)  
  File "/opt/python/lib/python3.7/site-packages/notion/store.py", line 286, in call_load_page_chunk    
    recordmap = self._client.post("loadPageChunk", data).json()["recordMap"]  
  File "/opt/python/lib/python3.7/site-packages/notion/client.py", line 262, in post    
    "message", "There was an error (400) submitting the request."

I didn't change anything in my code; this just started happening. I suspected a token_V2 problem, but I just checked and that isn't it. I saw #11 from back then, and it looks like there might have been a structural change in the API.

I'd love to hear your thoughts! I'm trying to get this resolved because a product of mine depends on it. Thank you!

@nissy131

nissy131 commented Mar 5, 2021

It seems that the Notion API has changed, and a large limit now gets a 400 error.
I changed "limit": 100000 to "limit": 100 and got a 200.
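
For reference, a minimal sketch of the only part that needs to change, the hard-coded payload in notion/store.py's call_load_page_chunk (the exact line number shown in the tracebacks above may differ between versions):

# notion/store.py, inside RecordStore.call_load_page_chunk
data = {
    "pageId": page_id,
    "limit": 100,  # was 100000; large limits now get a 400
    "cursor": {"stack": []},
    "chunkNumber": 0,
    "verticalColumns": False,
}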

@julrach

julrach commented Mar 5, 2021

It seems that the Notion API has changed, and a large limit now gets a 400 error.
I changed "limit": 100000 to "limit": 100 and got a 200.

@nissy131 OK, so I need to do something like client.get_block(full_url, limit=100) now?

@hellvesper

It seems that the Notion API has changed, and a large limit now gets a 400 error.
I changed "limit": 100000 to "limit": 100 and got a 200.

@nissy131 OK, so I need to do something like client.get_block(full_url, limit=100) now?

I think you should change this in the source:

"limit": 100000,

because it is a hard-coded value.

@julrach

julrach commented Mar 5, 2021

It seems that the Notion API has changed, and a large limit now gets a 400 error.
I changed "limit": 100000 to "limit": 100 and got a 200.

@nissy131 OK, so I need to do something like client.get_block(full_url, limit=100) now?

I think you should change this in the source:

"limit": 100000,

because it is a hard-coded value.

I was just about to edit my reply. We could probably create a PR out of this. Is 100 the highest value that works?

@danmcmahill

Should this be a hard-coded value, or should there be a way to adjust it early on in a program? And thanks for figuring this out. I hit the same thing: something that worked a few hours ago suddenly stopped working.

@julrach

julrach commented Mar 5, 2021

Should this be a hard-coded value, or should there be a way to adjust it early on in a program? And thanks for figuring this out. I hit the same thing: something that worked a few hours ago suddenly stopped working.

I agree with this, and I also think pagination is going to be needed as well, since some pages might have more than 100 blocks.
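
Something like the sketch below is what I have in mind, untested, and assuming the loadPageChunk response echoes back a cursor whose stack is empty once there are no more chunks; the helper name and the stopping condition are my own guesses rather than anything that exists in the library:

def load_all_page_chunks(client, store, page_id, chunk_size=100):
    """Hypothetical helper: fetch a page 100 blocks at a time instead of all at once."""
    cursor = {"stack": []}
    chunk_number = 0
    while True:
        data = {
            "pageId": page_id,
            "limit": chunk_size,
            "cursor": cursor,
            "chunkNumber": chunk_number,
            "verticalColumns": False,
        }
        response = client.post("loadPageChunk", data).json()
        store.store_recordmap(response["recordMap"])
        # Assumption: the server returns the next cursor, and an empty stack means we are done.
        cursor = response.get("cursor", {"stack": []})
        if not cursor.get("stack"):
            break
        chunk_number += 1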

@nanpuhaha

In my case...

Traceback (most recent call last):
  File "app.py", line 71, in <module>
    cvb = client.get_block(os.getenv('TABLE_URL'))
  File "/opt/hostedtoolcache/Python/3.8.8/x64/lib/python3.8/site-packages/notion/client.py", line 169, in get_block
    block = self.get_record_data("block", block_id, force_refresh=force_refresh)
  File "/opt/hostedtoolcache/Python/3.8.8/x64/lib/python3.8/site-packages/notion/client.py", line 162, in get_record_data
    return self._store.get(table, id, force_refresh=force_refresh)
  File "/opt/hostedtoolcache/Python/3.8.8/x64/lib/python3.8/site-packages/notion/store.py", line 184, in get
    self.call_load_page_chunk(id)
  File "/opt/hostedtoolcache/Python/3.8.8/x64/lib/python3.8/site-packages/notion/store.py", line 286, in call_load_page_chunk
    recordmap = self._client.post("loadPageChunk", data).json()["recordMap"]
  File "/opt/hostedtoolcache/Python/3.8.8/x64/lib/python3.8/site-packages/notion/client.py", line 260, in post
    raise HTTPError(
requests.exceptions.HTTPError: Invalid input.

@julrach

julrach commented Mar 5, 2021

Feel free to use #293 as a workaround for now.

@ttran

ttran commented Mar 5, 2021

# Monkey-patched versions of notion-py's loaders, with limits the API still accepts (see #293).
from tzlocal import get_localzone

import notion.client
import notion.store

def call_load_page_chunk(self, page_id):

    if self._client.in_transaction():
        self._pages_to_refresh.append(page_id)
        return

    data = {
        "pageId": page_id,
        "limit": 100,
        "cursor": {"stack": []},
        "chunkNumber": 0,
        "verticalColumns": False,
    }

    recordmap = self._client.post("loadPageChunk", data).json()["recordMap"]

    self.store_recordmap(recordmap)

def call_query_collection(
    self,
    collection_id,
    collection_view_id,
    search="",
    type="table",
    aggregate=[],
    aggregations=[],
    filter={},
    sort=[],
    calendar_by="",
    group_by="",
):

    assert not (
        aggregate and aggregations
    ), "Use only one of `aggregate` or `aggregations` (old vs new format)"

    # convert singletons into lists if needed
    if isinstance(aggregate, dict):
        aggregate = [aggregate]
    if isinstance(sort, dict):
        sort = [sort]

    data = {
        "collectionId": collection_id,
        "collectionViewId": collection_view_id,
        "loader": {
            "limit": 1000000,
            "loadContentCover": True,
            "searchQuery": search,
            "userLocale": "en",
            "userTimeZone": str(get_localzone()),
            "type": type,
        },
        "query": {
            "aggregate": aggregate,
            "aggregations": aggregations,
            "filter": filter,
            "sort": sort,
        },
    }

    response = self._client.post("queryCollection", data).json()

    self.store_recordmap(response["recordMap"])

    return response["result"]

def search_pages_with_parent(self, parent_id, search=""):
    data = {
        "query": search,
        "parentId": parent_id,
        "limit": 100,
        "spaceId": self.current_space.id,
    }
    response = self.post("searchPagesWithParent", data).json()
    self._store.store_recordmap(response["recordMap"])
    return response["results"]

notion.store.RecordStore.call_load_page_chunk = call_load_page_chunk
notion.store.RecordStore.call_query_collection = call_query_collection
notion.client.NotionClient.search_pages_with_parent = search_pages_with_parent

The very hot hot-fix, based on #293.
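
To apply it, one option is to paste the patch above into its own module, say a hypothetical notion_patch.py, and import it once before creating the client; everything else then works as usual:

import notion_patch  # hypothetical module containing the monkey-patch above

from notion.client import NotionClient

client = NotionClient(token_v2="<your token_v2>")
page = client.get_block("https://www.notion.so/your-page-url")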

Danappelxx added a commit to Danappelxx/notion-calendar that referenced this issue Mar 5, 2021
@Danappelxx

@ttran that did the trick, thank you!

@nissy131

nissy131 commented Mar 6, 2021

FYI, the API has also changed so that requesting loadPageChunk too many times now results in a 429 error (Too Many Requests).
In my case, I had to wait about a minute.
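
If you have to keep hitting the endpoint anyway, a rough back-off wrapper like the sketch below is one option; note it assumes notion-py surfaces the 429 as a requests.exceptions.HTTPError (the exception type seen in the tracebacks above), which may vary by version, and the wait time is just a guess:

import time

import requests


def get_block_with_retry(client, url, retries=5, wait_seconds=60):
    """Hypothetical helper: retry get_block when Notion rate-limits us."""
    for attempt in range(retries):
        try:
            return client.get_block(url)
        except requests.exceptions.HTTPError:
            # For simplicity this retries on any HTTPError, not only 429s.
            if attempt == retries - 1:
                raise
            time.sleep(wait_seconds)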

@fedeserbin

fedeserbin commented Mar 10, 2021

Hi all! Hope you are having a great day!

We've run into this problem as well, and it's very urgent for us to solve it.

We've been watching this thread for a few days, expecting a new library version with the issue fixed, but it seems it's still open.

How can we implement this workaround? We are using library version 0.0.28. I'll be honest, I don't know where to put this code.

Thanks for any help you can give us.

jean added a commit to jean/notion-py that referenced this issue Mar 17, 2021
jean added a commit to jean/notion-py that referenced this issue Mar 17, 2021
jean added a commit to PyConTH/www2021 that referenced this issue Mar 17, 2021
@adosib

adosib commented Apr 4, 2021

I got this error only on my second call to get_block() in the snippet below, but, oddly, there's no error if I move that second call further down in my code.

Doesn't work ('HTTPError: Invalid Input' on the second line):

reading_page_block = client.get_block(cfg['reading'])
reading_log = client.get_block(READING_LOG_ID) # child to reading_page_block

When I log the requests I see:

2021-04-04 13:42:16,414 DEBUG httplogger HTTP roundtrip
---------------- request ----------------
POST https://www.notion.so/api/v3/loadPageChunk
User-Agent: python-requests/2.25.1
Accept-Encoding: gzip, deflate
Accept: */*
Connection: keep-alive
Content-Length: 136
Content-Type: application/json

b'{"pageId": "b23c6a42-fdbc-4992-a6c0-8349caca6fa4", "limit": 100000, "cursor": {"stack": []}, "chunkNumber": 0, "verticalColumns": false}'
---------------- response ----------------
400 Bad Request https://www.notion.so/api/v3/loadPageChunk
Date: Sun, 04 Apr 2021 18:42:16 GMT
Content-Type: application/json; charset=utf-8
Content-Length: 102
Connection: keep-alive
Set-Cookie: [...]
X-DNS-Prefetch-Control: off
X-Frame-Options: SAMEORIGIN
Strict-Transport-Security: max-age=5184000; includeSubDomains
X-Download-Options: noopen
X-Content-Type-Options: nosniff
X-XSS-Protection: 1; mode=block
Referrer-Policy: same-origin
Content-Security-Policy: script-src 'self' 'unsafe-inline' 'unsafe-eval' https://gist.github.com https://apis.google.com https://api.amplitude.com https://widget.intercom.io https://js.intercomcdn.com https://logs-01.loggly.com https://cdn.segment.com https://analytics.pgncs.notion.so https://checkout.stripe.com https://js.stripe.com/v3 https://embed.typeform.com https://admin.typeform.com https://platform.twitter.com https://cdn.syndication.twimg.com https://www.googletagmanager.com https://x.clearbitjs.com; connect-src 'self' https://msgstore.www.notion.so wss://msgstore.www.notion.so ws://localhost:* https://notion-emojis.s3-us-west-2.amazonaws.com https://s3-us-west-2.amazonaws.com https://s3.us-west-2.amazonaws.com https://notion-production-snapshots-2.s3.us-west-2.amazonaws.com https: http: https://api.amplitude.com https://api.embed.ly https://js.intercomcdn.com https://api-iam.intercom.io wss://nexus-websocket-a.intercom.io https://logs-01.loggly.com https://api.segment.io https://api.pgncs.notion.so https://checkout.stripe.com https://js.stripe.com/v3 https://cdn.contentful.com https://preview.contentful.com https://images.ctfassets.net https://api.unsplash.com https://boards-api.greenhouse.io; font-src 'self' data: https://cdnjs.cloudflare.com https://js.intercomcdn.com; img-src 'self' data: blob: https: https://platform.twitter.com https://syndication.twitter.com https://pbs.twimg.com https://ton.twimg.com www.googletagmanager.com; style-src 'self' 'unsafe-inline' https://cdnjs.cloudflare.com https://github.githubassets.com https://platform.twitter.com https://ton.twimg.com; frame-src https: http:; media-src https: http:
X-Content-Security-Policy / X-WebKit-CSP: [identical to the Content-Security-Policy header above]
ETag: W/"66-sPnSEatuJIzcBqI+IwSToi2JcCw"
Vary: Accept-Encoding
CF-Cache-Status: DYNAMIC
cf-request-id: 093fca6ce90000f253180fe000000001
Expect-CT: max-age=604800, report-uri="https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct"
Server: cloudflare
CF-RAY: 63acacf4ac4cf253-ORD

{"errorId":"bddca7f1-97a3-472c-97ba-7a348c184a68","name":"ValidationError","message":"Invalid input."}

But I don't have this issue when I run:

  reading_page_block = client.get_block(cfg['reading'])

  books = {
      'name': [],
      'identifier': [],
      'pages_read': [],
      'timestamp': []
  }

  timestamp = datetime.now()

  # for each book or article, capture the desired fields
  for block in reading_page_block.children:
      if block.id not in [ARTICLES_ID, BOOKS_ID]:
          continue
      else:
          for row_block in block.collection.get_rows():
              if row_block.title:
                  books['name'].append(row_block.title)
                  try:
                      books['pages_read'].append(row_block.page_number or 0)
                  except AttributeError:
                      # articles don't have the pages_read property
                      if row_block.status and row_block.status.lower()=="read":
                          books['pages_read'].append(1)
                      else:
                          books['pages_read'].append(0)
                  books['identifier'].append(row_block.id)
                  books['timestamp'].append(timestamp) 
  
  reading_log = client.get_block(READING_LOG_ID) # child to reading_page_block

Maybe I should also note that I get the same error upon entering the for loop above if I use a subclass of NotionClient with a modified constructor that just sets self.session to enable logging to stdout; I find that behavior strange as well. I'll attach the subclass for good measure.
NotionClientSuper.txt
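
For anyone who wants the same request logging without subclassing NotionClient, the standard-library switches below are one generic way to dump every HTTP round trip made by requests/urllib3; this is not a notion-py feature, just ordinary Python logging:

import logging
from http.client import HTTPConnection

# Print the request line and headers for every HTTP call made via http.client,
# and surface urllib3's DEBUG-level connection logs.
HTTPConnection.debuglevel = 1
logging.basicConfig(level=logging.DEBUG)
logging.getLogger("urllib3").setLevel(logging.DEBUG)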
