APIv2 takes much longer to retrieve data (compared to APIv1)

So I’ve been trying to upgrade my script from APIv1 to APIv2 (patreon-php), but I noticed that APIv2 is much slower than APIv1 at retrieving user data. It takes hours to get the data of 1000+ patrons, compared to 2–3 minutes with APIv1. Is this a known issue?

The reason I retrieve all user data at once is that the script runs as a cronjob every X minutes to make sure users are still active. Only being able to run the cronjob every few hours would be rather problematic (as I would have to force a short login session).

There may be issues related to the way you set scopes and includes, but it is hard to tell just from this.

You are using a v2 client with v2 endpoints, right? And making v2 calls? Which calls are you using?

Hi codebard! Thank you so much for your reply.
I don’t know what this forum would do without you!

Yes, to both your questions.

It’s difficult to figure out exactly what is going on, as I didn’t create our API script myself (it was outsourced), but I’ve been testing with Insomnia, and I think the problem is that with APIv1 you can get all the user data at once with a single query, while with APIv2 an additional query has to be performed for each user to get the necessary data. Here’s the relevant part of the APIv2 script (with the calls):

    // Fetches one page of campaign members, with tier and address includes
    // plus a fixed set of member and tier fields.
    public function fetch_page_of_members_from_campaign($campaign_id, $page_size, $cursor = null) {
        $url = "campaigns/{$campaign_id}/members?page".urlencode('[count]')."={$page_size}";
        if ($cursor != null) {
            $escaped_cursor = urlencode($cursor);
            $url = $url . "&page".urlencode('[cursor]')."={$escaped_cursor}";
        }
        return $this->get_data($url."&include=currently_entitled_tiers,address&fields".urlencode('[member]')."=full_name,is_follower,last_charge_date,last_charge_status,lifetime_support_cents,currently_entitled_amount_cents,patron_status&fields".urlencode('[tier]')."=amount_cents,created_at,description,discord_role_ids,edited_at,patron_count,published,published_at,requires_shipping,title,url");
    }

    // Fetches the full details of a single member.
    public function fetch_member_details($member_id) {
        return $this->get_data("members/{$member_id}?include=address,campaign,user,currently_entitled_tiers&fields".urlencode('[member]')."=full_name,is_follower,email,last_charge_date,last_charge_status,lifetime_support_cents,patron_status,currently_entitled_amount_cents,pledge_relationship_start,will_pay_amount_cents&fields".urlencode('[tier]')."=title&fields".urlencode('[user]')."=full_name,hide_pledges");
    }
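
As far as I can tell, the rest of the script then drives those two functions in a loop along these lines. This is my own reconstruction rather than the outsourced code itself, and it assumes the library’s default array return format ($api and $campaign_id are just placeholders):

    // Rough reconstruction of the loop (not the actual outsourced code):
    // one request per page, plus one fetch_member_details request per member.
    $cursor = null;
    do {
        $page = $api->fetch_page_of_members_from_campaign($campaign_id, 100, $cursor);

        foreach ($page['data'] as $member) {
            // One extra API request per member (e.g. to get the email address).
            $details = $api->fetch_member_details($member['id']);
            // ... update the local user record from $details here ...
        }

        // Follow the pagination cursor until there is no next page
        // (cursor location assumed from the API v2 pagination format).
        $cursor = isset($page['meta']['pagination']['cursors']['next'])
            ? $page['meta']['pagination']['cursors']['next'] : null;
    } while ($cursor != null);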

It still shouldn’t take hours to go through 1000+ users, though.

The code example seems to be pulling a full page of members together with their entire details from an endpoint that is designed to return a member list page. If the page size is X, that means X members’ worth of data in every page request.

A more efficient way to do this would be to get the member list page with only the member ids (or better yet, with the default call), and then iterate over those ids in your code, using the fetch_member_details function to get each member’s details.
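
For example, a lighter page call could be something along these lines (fetch_page_of_member_ids is just a hypothetical name; it requests the member list page without any extra includes or fields, and each returned id would then be passed to fetch_member_details):

    // Hypothetical lighter page fetch: no includes and no extra fields,
    // so each page request only returns the member list itself.
    public function fetch_page_of_member_ids($campaign_id, $page_size, $cursor = null) {
        $url = "campaigns/{$campaign_id}/members?page".urlencode('[count]')."={$page_size}";
        if ($cursor != null) {
            $url .= "&page".urlencode('[cursor]')."=".urlencode($cursor);
        }
        return $this->get_data($url);
    }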

You also seem to be asking for some additional details in your includes in fetch_page_of_members_from_campaign. You can just copy the fetch_member_details function to fetch_member_details_custom etc., and then copy over the fields you require into its call.
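
Roughly like this, for example — fetch_member_details_custom is a hypothetical name for a copy of your fetch_member_details with the tier fields from the page call merged into its field list:

    // Hypothetical copy of fetch_member_details with the tier fields from the
    // page call added, so the per-member request carries all the needed data.
    public function fetch_member_details_custom($member_id) {
        return $this->get_data("members/{$member_id}?include=address,campaign,user,currently_entitled_tiers"
            ."&fields".urlencode('[member]')."=full_name,is_follower,email,last_charge_date,last_charge_status,lifetime_support_cents,patron_status,currently_entitled_amount_cents,pledge_relationship_start,will_pay_amount_cents"
            ."&fields".urlencode('[tier]')."=amount_cents,created_at,description,discord_role_ids,edited_at,patron_count,published,published_at,requires_shipping,title,url"
            ."&fields".urlencode('[user]')."=full_name,hide_pledges");
    }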

This is as much as I can comment without seeing more of your code…