All,
I have a script I am building to find all open pull requests and compare the SHA hashes, but I can't seem to reference the fields I need....
for repo in g.get_user().get_repos():
    print(repo.full_name)
    json_pulls = requests.get('https://api.github.com/repos/' + repo.full_name + '/pulls?state=open+updated=<' + str(cutoff_date.date()) + '&sort=created&order=asc')
    if json_pulls.ok:
        for item in json_pulls.json():
            for c in item.items():
                #print(c["0"]["title"])
                #print(json.dumps(state))
                print(c)
The code cycles through the existing repos and lists the pull requests, and I get output, but I can't for the life of me figure out how to collect the individual fields...
I tried using these references:
print(c['title']) - gives a "not defined" error
print(c['0']['title']) - gives a tuple error
What I am looking for is a simple list for each request:
title
id
state
base / sha
head / sha
Can someone please point out what I am doing wrong in referencing the JSON items in my Python script, as it is driving me crazy.
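For what it's worth, both errors follow from iterating item.items(): that yields (key, value) tuples, not dicts, so string indexing fails. A minimal sketch with a hypothetical, trimmed-down payload (not real API output):

```python
# Hypothetical, cut-down pull request payload for illustration only.
item = {"title": "Fix typo", "id": 1, "state": "open"}

# .items() yields (key, value) tuples, so c is a tuple here and
# c['title'] raises TypeError: tuple indices must be integers.
for c in item.items():
    print(c)  # e.g. ('title', 'Fix typo')

# Index the dict itself instead:
print(item["title"])  # Fix typo
```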
The full code as it stands (with your help, of course...):
# py -m pip install <module> to install the imported modules below.
#
#
# Import stuff
from github import Github
from datetime import datetime, timedelta
import requests
import json
import simplejson
#
#
# Declare stuff
# Set the number of past days to search in the range
PAST = 5
# Get the cut-off date for the repos (PAST days ago)
cutoff_date = datetime.now() - timedelta(days=PAST)
#print(cutoff_date.date())
# Repo OAuth key for my repo
OAUTH_KEY = "(get from github personal keys)"
# Set base URL for API query
BASE_URL = 'https://api.github.com/repos/'
#
#
# BEGIN STUFF
# First create a Github instance:
g = Github(login_or_token=OAUTH_KEY, per_page=100)
# Get all repositories for my account that are open and updated in the last no. of days...
for repo in g.get_user().get_repos():
    print(repo.full_name)
    json_pulls = requests.get('https://api.github.com/repos/' + repo.full_name + '/pulls?state=open+updated=<' + str(cutoff_date.date()) + '&sort=created&order=asc')
    if json_pulls.ok:
        for item in json_pulls.json():
            print(item['title'], item['id'], item['state'], item['base']['sha'], item['head']['sha'])
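As an aside, the query string above mixes search-API syntax into the /pulls endpoint ('updated=<...' and 'order=' are not filters that endpoint understands, so they are silently ignored). A sketch of the same request, using requests' params= argument and filtering on updated_at client-side; stale_open_pulls is a hypothetical helper name, untested against a live token:

```python
import requests
from datetime import datetime

def is_stale(pr, cutoff):
    # GitHub timestamps look like '2023-01-31T12:00:00Z'.
    return datetime.strptime(pr['updated_at'], '%Y-%m-%dT%H:%M:%SZ') < cutoff

def stale_open_pulls(full_name, token, cutoff):
    # The /pulls endpoint supports state/sort/direction but has no date
    # filter, so updated_at is checked client-side after the call.
    resp = requests.get(
        'https://api.github.com/repos/' + full_name + '/pulls',
        params={'state': 'open', 'sort': 'created', 'direction': 'asc'},
        headers={'Authorization': 'token ' + token},
    )
    resp.raise_for_status()
    return [p for p in resp.json() if is_stale(p, cutoff)]
```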
The repo site is a simple site, with two repos and one or two pull requests to play against.
The idea of the script, when it is done, is to cycle through all the repos, find the pull requests that are older than x days and still open, locate the SHA for the branch (and the SHA for the master branches, so those can be skipped...), and remove the branches that are not master branches, thus removing old code and pull requests to keep the repos tidy....
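That end goal can be sketched with PyGithub alone (no raw requests), assuming the token has permission to delete refs; stale_pr_branches is a hypothetical helper name, and the actual delete call is left commented out as a dry run:

```python
from datetime import datetime, timedelta

PAST = 5
cutoff_date = datetime.now() - timedelta(days=PAST)

def stale_pr_branches(repo, cutoff):
    """Yield (pr, branch_name) for open PRs not updated since cutoff,
    skipping any PR whose head branch is the repo's default branch."""
    for pr in repo.get_pulls(state='open', sort='created'):
        if pr.updated_at < cutoff and pr.head.ref != repo.default_branch:
            yield pr, pr.head.ref

# Usage with PyGithub (network calls, so commented out here):
# from github import Github
# g = Github(login_or_token=OAUTH_KEY, per_page=100)
# for repo in g.get_user().get_repos():
#     for pr, branch in stale_pr_branches(repo, cutoff_date):
#         print('would delete', repo.full_name, branch, pr.head.sha)
#         # repo.get_git_ref('heads/' + branch).delete()  # the actual cleanup
```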
