@Nicholas McFadden has joined the channel
@deactivateduser has joined the channel
Tort Intake Professionals has joined this channel by invitation from Shield Legal.
Is it too early to add our signature hashtags🧐 @deleted-U05QUSWUJA2
#AI #HumansAreNotFood
set the channel description: To discuss the progress and issues related to automating reports and the visualization of them in Looker
set the channel topic: #AI #HumansAreNotFood
I was a bit concerned that Aidan was not able to make it to our weekly meetings when you told us, @deleted-U05QUSWUJA2, that he was able to make it 😆
@Nicholas McFadden would you be able to give us a bit of a crash course in the Looker Studio stuff either during the meeting on Thursday or in a separate meeting? We are excited to put it to good use!
Yes that was my intention. I believe we recorded the training session we had so I will look for that and send it to yall
@here I have set up access for all of you. Please go to the link below and confirm you can log in and see the BigQuery datasets/tables. https://console.cloud.google.com/bigquery?project=tort-intake-professionals&ws=!1m0
This is your JSON key file for GCP. Your Service Account is: tipreportingteam_sa
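For reference, here's a minimal sketch of using that key file to authenticate to BigQuery from Python and confirm the datasets are visible. The local filename is just an assumption; save the key wherever works for you:
from google.cloud import bigquery

# Minimal sketch: authenticate with the service-account key file and list
# the datasets it can see. The filename below is an assumption.
client = bigquery.Client.from_service_account_json("tipreportingteam_sa.json")

for dataset in client.list_datasets():
    print(dataset.dataset_id)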
@Nicholas McFadden Whenever you get the chance, could you grant Aidan and me access to this (the screenshot):
I am creating a Python script that will look for a specific email subject line on my end (it will be a scheduled report from Law Ruler), and based on my research I believe I need access to this in order for it to work. It looks like Google Cloud Functions uses a service account for authentication, whereas if I ran the script locally it could use OAuth 2.0 instead (I am trying to use Google Cloud Functions rather than running it locally).
That is the error I get when trying to enable permission for the Gmail API. With that, I am going to obtain a .json file which should give me the necessary permissions to pull from my Gmail for the Python script
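For what it's worth, here's a rough sketch of what that lookup could look like once the .json key is in hand. It assumes domain-wide delegation to the mailbox and the gmail.readonly scope; the key filename, mailbox, and subject line are placeholders, not the real values:
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Rough sketch, not the final script: search a mailbox for the scheduled
# Law Ruler report by subject line. Key file, mailbox, and subject are placeholders.
SCOPES = ["https://www.googleapis.com/auth/gmail.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "gmail-key.json",                              # placeholder key file
    scopes=SCOPES,
    subject="user@tortintakeprofessionals.com",    # placeholder mailbox to read
)
gmail = build("gmail", "v1", credentials=creds)

# Gmail search syntax works in the q parameter, e.g. a subject: filter
resp = gmail.users().messages().list(
    userId="me", q='subject:"Scheduled Report"', maxResults=10
).execute()

for msg in resp.get("messages", []):
    detail = gmail.users().messages().get(userId="me", id=msg["id"]).execute()
    print(detail["snippet"])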
Law Ruler API: https://apim.lawruler.com/tortintakeprofessionals/swagger/ui/index
import asyncio
import json
import math

import requests


def get_auth(u_name, u_pass, c_id, c_secret):
    # Authorization endpoint
    auth_url = 'https://auth.lawruler.com/tortintakeprofessionals/identity/connect/token'
    # Pass credentials
    body = {'grant_type': 'password',
            'username': u_name,
            'password': u_pass,
            'scope': 'openid profile read write offline_access',
            'client_id': c_id,
            'client_secret': c_secret
            }
    headers = {'Content-Type': 'application/x-www-form-urlencoded'}
    # Execute authorization token request
    auth_resp = requests.post(auth_url, data=body, headers=headers)
    # Store token info
    auth_data = auth_resp.json()
    try:
        a_token = auth_data['access_token']
        token_type = auth_data['token_type']
        r_token = auth_data['refresh_token']
        return a_token
    except KeyError as e:
        print(f"Error retrieving access token: {e}")
        return None


def get_lead_count(a_token):
    # 'GetInboxItems' endpoint used:
    url = "https://apim.lawruler.com/tortintakeprofessionals/ApiCases/GetInboxItems"
    headers = {
        'Content-Type': 'application/json',
        'Authorization': f'Bearer {a_token}'
    }
    params = {
        "inboxType": 0,
        "page": 1,
        "pageSize": 1
    }
    response = requests.get(url, params=params, headers=headers)
    # Retrieve json response
    try:
        json_resp = response.json()
        lead_count = json_resp.get('Count')
        end_page = math.ceil(lead_count / 1000)
        return end_page
    except (KeyError, TypeError, json.JSONDecodeError) as e:
        print(f"Error retrieving lead count: {e}")
        return None


def main():
    # Define resources and global variables
    c_id = "tortintakeprofessionals.client.resource.owner"
    c_secret = "3RsxCggviuYBdBZQ15DT1faJFOPRJLMG"
    u_name = "nickmcfadden@tortintakeprofessionals.com"
    u_pass = "tip123"
    try:
        a_token = get_auth(u_name, u_pass, c_id, c_secret)
        if not a_token:
            raise Exception("Failed to retrieve access token")
        end_page = get_lead_count(a_token)
        if not end_page:
            raise Exception("Failed to retrieve lead count")
        # get_links is defined elsewhere in the script; it returns a DataFrame
        links_df = asyncio.run(get_links(a_token, end_page))  # Store DataFrame directly
        print(links_df.shape)  # Print the shape of the DataFrame
        print(links_df.head())  # Print a preview of the DataFrame
    except Exception as e:
        print(f"An error occurred: {e}")


if __name__ == "__main__":
    main()
@Nicholas McFadden Just want to confirm we are still good for our 12:00 meeting
Going to need to cancel this week's meeting, sorry
https://learn.microsoft.com/en-us/power-automate/overview-solution-flows
https://developers.google.com/gmail/api/reference/rest/v1/users/watch
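That users.watch endpoint is for push notifications instead of polling the inbox. Roughly it looks like this, assuming a Pub/Sub topic already exists; the key file, mailbox, and topic name below are placeholders:
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Rough sketch of Gmail push notifications via users.watch. The key file,
# mailbox, and Pub/Sub topic name are placeholders, not the team's real values.
creds = service_account.Credentials.from_service_account_file(
    "gmail-key.json",
    scopes=["https://www.googleapis.com/auth/gmail.readonly"],
    subject="user@tortintakeprofessionals.com",
)
gmail = build("gmail", "v1", credentials=creds)

watch_body = {
    "topicName": "projects/tort-intake-professionals/topics/gmail-report-alerts",  # placeholder topic
    "labelIds": ["INBOX"],
}
resp = gmail.users().watch(userId="me", body=watch_body).execute()
print(resp)  # returns historyId and expiration; the watch has to be renewed periodically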
Troubleshooting info:
Principal: edward@tortintakeprofessionals.com
Resource: tort-intake-professionals
Troubleshooting URL: http://console.cloud.google.com/iam-admin/troubleshooter;permissions=monitoring.timeSeries.list;principal=edward@tortintakeprofessionals.com;resources=%2F%2Fcloudresourcemanager.googleapis.com%2Fprojects%2Ftort-intake-professionals/result
Missing permissions: monitoring.timeSeries.list
@Nicholas McFadden
Below is the Shield Legal SFTP login info.
Host: sftp://dicellolevitt.files.com
Username: Shieldlegal
Port: 22
Password: #Shi3LdL3g2l!
The Flatirons benchmark script has been updated, so you can run your script for the financials now if needed @Nicholas McFadden
This is what I had previously for the Benchmark Dashboard, but it's in Power BI. I'll keep working on it in Looker, even though Looker is ugly
@Nicholas McFadden When you get a chance, can you show us how to add data from the Benchmark report to Looker? Currently it's only pulling data from the first tab of the Benchmark report, and we need it to pull from the All Data tab preferably, since everything it's pulling right now is just the labels from the first sheet.
I set it up on the all_data tab
are you free to hop on a gmeet?
hey @Nicholas McFadden, yeah we're totally good now, I got it all figured out now, that's my bad
You have to set up the connection within BQ
I'm working on getting the fields/functions ready for the CPAP campaign. I'll let you know if I run into any other roadblocks, but I saw the All Data tab information
Like, under that G Sheet dataset, try adding a new table, pick Drive as the source, set the type to Google Sheet, and put in the name of the sheet you want it to pull
we're golden, I thought we didn't have the all data sheet, but we do, so that solves all the problems I was having thankfully :thehorns::skintone_4:
You'll need the URI, which is the link of the sheet but without the sharing parameters at the end of the URL
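If it helps, the same setup can be scripted instead of clicking through the console. A minimal sketch with the BigQuery Python client, assuming credentials that also carry the Drive scope; the dataset, table name, sheet URL, and tab name are placeholders:
from google.cloud import bigquery

# Minimal sketch: create a BigQuery external table backed by a Google Sheet.
# Dataset, table, sheet URL, and tab name below are placeholders.
client = bigquery.Client(project="tort-intake-professionals")

external_config = bigquery.ExternalConfig("GOOGLE_SHEETS")
external_config.source_uris = [
    "https://docs.google.com/spreadsheets/d/SHEET_ID"  # sheet link without the sharing suffix
]
external_config.options.range = "All Data"         # the tab to pull
external_config.options.skip_leading_rows = 1      # treat the first row as headers

table = bigquery.Table("tort-intake-professionals.benchmark.all_data")  # placeholder dataset.table
table.external_data_configuration = external_config
client.create_table(table)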