https://console.cloud.google.com/iam-admin/iam?project=tort-intake-professionals
import requests

def get_lr_auth(c_id, c_secret, u_name, u_pass):
    auth_url = 'https://auth.lawruler.com/tortintakeprofessionals/identity/connect/token'
    # Pass credentials
    body = {'grant_type': 'password',
            'username': u_name,
            'password': u_pass,
            'scope': 'openid profile read write offline_access',
            'client_id': c_id,
            'client_secret': c_secret
            }
    headers = {'Content-Type': 'application/x-www-form-urlencoded'}
    # Execute authorization token request
    auth_resp = requests.post(auth_url, data=body, headers=headers)
    # Store token info
    auth_data = auth_resp.json()
    a_token = auth_data['access_token']
    token_type = auth_data['token_type']
    r_token = auth_data['refresh_token']
    return a_token
LRID ="tortintakeprofessionals.client.resource.owner" LRSECRET= "3RsxCggviuYBdBZQ15DT1faJFOPRJLMG" LRUSERNAME= "nickmcfadden@tortintakeprofessionals.com" LRPASSWORD= "tip123" PROJECT_ID= "tort-intake-professionals"
I asked Ryan for access to GitHub and he said for you to confirm which projects.
https://themedialaboratory.slack.com/archives/C01EQF51UTC/p1726936480164979 Does this mean that the campaign will require an integration?
Hey not feeling well. Headed home. If you need me text or call me. I'll try to be on when I can
Sorry just saw this no I was on for it
@Dustin Surwill can you get me an update on everything by 3pm?
Want me to try converting NEC to GCP?
Yes because I can get it to work on Postman so should work fine with the new system
https://themedialaboratory.slack.com/archives/C05GHTD7M24/p1728060253520829 I changed the LR automation to go to GCP on re-trigger and triggered 600094 - Nathen Borrero, yesterday with no document upload failures. Should we swap signed e-sign final to GCP?
Was this similar to the fix you had with VGA that now gets the docs going through?
The same thing. All I did was submit via python
Was this a production issue or was this a testing issue?
MPA unable to parse the docs to upload
Are we telling them when we are testing and what is sent over?
I did not have contact information until eod yesterday. Will send email including new re-trigger. MPA issue seems to be a general error that recently started across multiple flows. Seems to be any flow that sends to SmartAdvocate.
also are these failing and are we receiving logs/notifications?
in MPA these are failing but take about 10 hours each and notify via slack
then the team needs to be watching for those failures and at least throwing a hand up
if this has started across multiple flows which ones are those and lets make those the top priority for moving to GCP
Will get you that list soon. How do we set up a schedule for Pub/Sub to trigger the main GCP function?
there is Cloud Scheduler, and you can have it publish a Pub/Sub message on the schedule, thus triggering the rest
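A sketch of that wiring with the Python client library (job and topic names here are made up; the same setup can be done in the console or with gcloud):
```
# Sketch: a Cloud Scheduler job that publishes to a Pub/Sub topic on a cron schedule;
# the Pub/Sub-triggered function subscribed to that topic then runs.
# Assumptions: google-cloud-scheduler is installed and the topic already exists.
from google.cloud import scheduler_v1

client = scheduler_v1.CloudSchedulerClient()
parent = client.common_location_path('tort-intake-professionals', 'us-west4')

job = scheduler_v1.Job(
    name=f'{parent}/jobs/trigger-main-function',  # hypothetical job name
    schedule='*/30 * * * *',                      # cron: every 30 minutes
    pubsub_target=scheduler_v1.PubsubTarget(
        topic_name='projects/tort-intake-professionals/topics/main-trigger',  # hypothetical topic
        data=b'{}',                               # message body the function receives
    ),
)
client.create_job(parent=parent, job=job)
```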
Hey not feeling great. Trust you to handle anything for the rest of today.
Have James handle any communication needs and then just keep on track
To update you: 9 of the 13 on the sheet plus NEC Wagstaff are ready for GCP (mapping and code). Need to deploy as functions, then test
Can I get Oct 25 off?
Yeah shouldn't be a problem
import fitz  # PyMuPDF

def extract_sig(FILE: str):
    pdf_document = fitz.open(FILE)
    for i, page in enumerate(pdf_document.pages()):
        if i != 4:  # only page index 4 is of interest here
            continue
        data = page.get_text('dict')['blocks']
        image_data = [_ for _ in data if _.get('ext') == 'png']  # keep PNG image blocks only
        for j, image in enumerate(image_data):
            with open(f'{FILE}-{i}-{j}.{image["ext"]}', 'wb') as f:
                f.write(image['image'])
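For reference, a hypothetical call (the filename is made up); each PNG found on page index 4 gets written next to the source file:
```
# Hypothetical usage: writes e.g. 'retainer.pdf-4-0.png' for each PNG block on page index 4.
extract_sig('retainer.pdf')
```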
Key=6084A579C6CA45059910B47CE286B
Creating a Lead/Intake via the Legal CRM API Want to create a New Lead in Legal CRM or Update the data to an Existing Lead? Legal CRM uses the term “Lead” and is meant to refer to either a Lead or Intake.
Creating a Lead Sample of a POST link with all of the fields, use the ones you would like to try: (This is a URL-Encoded example for visual purposes, we highly recommend POSTING all these data points using form-data as individual post parameters instead.)
https://sample.yoursiteURL.com/api-legalcrmapp.aspx?FirstName=John&LastName=Smith&Address1=123 Anystreet&City=WilkesBarre&State=PA&Zip=12345&CellPhone=570-123-7899&HomePhone=570-123-7899&Email1=test@testing.com&Summary=His vehicle was hit by a drunk driver&CaseType=Auto Accident&LeadProvider=Google&Hear=Live Chat&Key=Y8d7cMkSLGhONnZyjzVAVbhGV9eXX8
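A minimal sketch of the recommended form-data POST, using the same sample fields and placeholder URL/Key from the docs above:
```
# Sketch: POST the Lead fields as form-data instead of a URL-encoded query string.
import requests

fields = {
    'FirstName': 'John',
    'LastName': 'Smith',
    'Address1': '123 Anystreet',
    'City': 'WilkesBarre',
    'State': 'PA',
    'Zip': '12345',
    'CellPhone': '570-123-7899',
    'HomePhone': '570-123-7899',
    'Email1': 'test@testing.com',
    'Summary': 'His vehicle was hit by a drunk driver',
    'CaseType': 'Auto Accident',
    'LeadProvider': 'Google',
    'Hear': 'Live Chat',
    'Key': 'Y8d7cMkSLGhONnZyjzVAVbhGV9eXX8',
}
resp = requests.post('https://sample.yoursiteURL.com/api-legalcrmapp.aspx', data=fields)
print(resp.status_code, resp.text)
```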
FYI
https://stackoverflow.com/questions/2498875/how-to-invert-colors-of-image-with-pil-python-imaging
https://pymupdf.readthedocs.io/en/latest/recipes-images.html
Does this work, or do I need to find and add the reason?
from concurrent.futures import ThreadPoolExecutor

with ThreadPoolExecutor() as executor:
    executor.submit(run, *args, capture_output=True, shell=True)
I found an issue with the way I was doing the threadpool and had to resort to the following:
from concurrent.futures import ThreadPoolExecutor, as_completed

with ThreadPoolExecutor() as executor:
    threads = []
    for args in range(10):  # replace with your iterable
        threads.append(executor.submit(run, args, capture_output=True, shell=True))
    _ = [_.result() for _ in as_completed(threads)]  # as_completed will block until all threads are complete
lrinboxdump gcloud function complete:
https://github.com/shield-legal/lawruler-ui
Can trigger manually via powershell:
curl -H @{"Authorization" = "Bearer $(gcloud auth print-identity-token)"} https://lawruler-ui-144747688343.us-west4.run.app/lr_inbox/to_bq/
or in the repo
./utils.ps1 test-bq
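The same trigger can also be hit from Python; a sketch (assumes ADC is backed by a service account, since fetch_id_token does not mint ID tokens for plain user credentials):
```
# Sketch: call the Cloud Run endpoint with an identity token, like the curl above.
import requests
from google.auth.transport.requests import Request
from google.oauth2 import id_token

url = 'https://lawruler-ui-144747688343.us-west4.run.app/lr_inbox/to_bq/'
token = id_token.fetch_id_token(Request(), audience=url)
resp = requests.get(url, headers={'Authorization': f'Bearer {token}'})
print(resp.status_code)
```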
it takes about 5 minutes to run. there are 3 runs in bq currently (if you delete the table, it will auto re-create) https://console.cloud.google.com/bigquery?project=tort-intake-professionals&ws=!1m5!1m4!4m3!1stort-intake-professionals!2slr_data!3slr_inbox
In order to update:
a. ./utils.ps1 build (with docker installed)
b. gcloud auth configure-docker us-west4-docker.pkg.dev, then ./utils.ps1 push (with docker installed; pushes to us-west4-docker.pkg.dev/tort-intake-professionals/custom-docker)
c. under lawruler-ui
d. select image with latest tag
Took me about 2 hours Friday and 5 hours today (Sunday, with distractions and only 1 monitor; should have been 2 or 3 hours on Sunday)
Hey ryan is trying to start using some of the lrinbox data, but because lr_data.lr_inbox in BigQuery is set to a Data Location of US-WEST4 instead of the US multi-region he cannot create his model in DBT, with that being the only dataset not in the same region. He wants to know if you can update that?
should I be using US multi-region for everything instead of US-WEST4 when I can?
I guess so if it is going to mess up their scripts
Does this also mean I should toss everything in the "Tort Intake Professionals" project?
might as well
That means a new cloudsql instance for that project...
I’m fine keeping it separate if it doesn’t hinder us going cross project. Otherwise maybe we need to start cloud sql instance in TIP, migrate over data from Integrations and delete old projects
cloudsql instance in the tort-intake-professionals project. to prevent the issues of going cross-project
I also found a way to make cloudsql cheaper
Alan reached out with the following:
We just got off a call with Meadow regarding VGA - Julia mentioned that she received an email automation of a test lead in a closed case type: Test Test Gaming Addiction - Meadow Law - Ghozland (TIP) - Shield Legal, Test Lead Dispos, Created: Jul 2, 2024 2:06 PM, Lead # 550520
Also Ashley mentioned that many of the retainers that were uploaded into Smart Advocate are not showing in the correct document section, causing an error based on NO RETAINER. She mentioned that she was working with you on this and sent an email but hasn't heard back. Thanks
Was probably Ahsan doing a test for that case_type. He did not do it through GCP so I have 0 logs. I will talk to him about it tomorrow. Alex and Britney asked us about it as well
We do not have logs that mention missing docs. Since everything is already pushed we can only re-send docs without causing a duplicate
Any comments on the baypoint doc before I send it to Cam?
https://tortintakeprofessionals.monday.com/boards/6304263960/pulses/7828891879
My ETA calculation was not taking into account the number of accounts. With 8 accounts it says about 24 hours:
Would this be of any help? it was in the github: https://github.com/shield-legal/leadspedia/blob/main/leadspedia/leadspedia.py last update Jan 2022
Just a heads up, we will have to do a full audit of Wagstaff tomorrow morning for Cam
Saying there are differences on count between us and them
So we will just have to go through it with Tony tomorrow morning
David Warren, Douglas Lang, Samuel Calvary
David Lee Warren 564377: never sent via automation, only to Patsy and Nick M on 10/3/24
Douglas James Langer Sr 586064: currently in QA, WIP
Samuel Gene Calvary 576109: sent via MPA 9/3/24
MPA message for Calvary: https://themedialaboratory.slack.com/archives/C04A8RLMASJ/p1725408037413959
```
import logging
import sys
from concurrent.futures import ThreadPoolExecutor
from threading import Lock
from typing import Callable


class CustomThreadPoolExecutor(ThreadPoolExecutor):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.tasks = []
        self.logger = logging.getLogger(self.__class__.__name__)
        self._lock = Lock()

    def submit(self, fn: Callable, /, *args, **kwargs):
        task = super().submit(fn, *args, **kwargs)
        task.func = fn.__name__
        task.args = args
        task.kwargs = kwargs
        with self._lock:
            self.tasks.append(task)
        return task

    def remove_done(self):
        not_done = []
        with self._lock:
            for _ in self.tasks:
                if not _.done():
                    not_done.append(_)
                else:
                    _.result()
            self.tasks = not_done

    def shutdown(self, wait: bool = True, *, cancel_futures: bool = False, await_tasks: bool = True):
        if sys.exc_info()[0]:  # an exception is propagating, so do not wait on remaining tasks
            await_tasks = False
        if await_tasks and wait and not cancel_futures:
            self.logger.info(f'Waiting for all tasks ({len(self.tasks)}) to complete before shutdown...')
            while self.tasks:
                wait_for_futures(self.tasks, self.logger)  # wait_for_futures: helper defined elsewhere
                self.remove_done()
        super().shutdown(wait, cancel_futures=cancel_futures)
```
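Hypothetical usage of the executor above (assumes the external wait_for_futures helper is also defined; the echo commands are stand-ins for real work):
```
# Sketch: submit shell commands; shutdown() on context exit waits for tracked tasks.
from subprocess import run

with CustomThreadPoolExecutor(max_workers=4) as pool:
    for i in range(10):
        pool.submit(run, ['echo', str(i)], capture_output=True)
    pool.remove_done()  # surfaces exceptions from finished tasks via .result()
```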
https://cloud.google.com/sdk/docs/install https://cli.github.com/
We should send our notes/comments to Joe. Do you want me to do that?
Dang I forgot about that yes send it to everyone except the lawruler people
These 2 clients did not post into the Bay point system. They are asking to have them reposted into their system: Jameca Turner, Virginia Cassady
But we addressed this yesterday correct? That their system timed out for the docs?
No I saw you responded to cam saying it was done
Just Tony reached out to me, so at first I thought it was something else, but sounds like he just hadn’t gotten to it or something. Either way, told him it was handled
Is it me or does it seem like anything Craig is involved with ends up with Cam messaging us?
Feel like Craig doesn’t need to be part of these discussions and since he doesn’t know anything ends up making things more complicated
I talked to @Malissa about that yesterday. We made it clear to Craig that she is the point of contact for the team
How real time should the lr sync be? I think I can get it to run faster than every 30 minutes with only 1 account
With whatever we are doing let’s ease into it. Slowly stress test the system
stress test which system? GCP or LR? I think we have already stress tested LR with 8 accounts concurrent for 56 hours (last weekend). GCP seems much faster than my laptop. My laptop takes 6 minutes to update the last 2 hours worth of lr data with 1 account whereas GCP takes 1 minute to update the last hour with 1 account.
By the way, the dump is now in the cloud set to run every 15 minutes.
Ok ya I was saying for the LR side to set only 15min pulls. I just don’t want to overload anything out the gate is all.
Regarding Ryan’s request, do you have a good handle on how to get him those numbers?
Yes, just figured he would request it from you and you would request from me
Oh ya he did not so thanks for the heads up
How’s everything going over there today?
ACTS did not give us mappings yesterday but they said they would send them by the end of next week
Ok and did the Legifiy people send the mappings?
No, we will be reaching out about that
Let’s plan to check in around Wednesday to see where ACTS is at
Have you seen that there might be some built in CDC functionality with Cloud SQL? I am starting to look into it. May need to build our own to manage data size or just do overwrites and "snapshots" saved to Cloud Storage
CDC is not available for postgres, only Microsoft SQL Server:
Looks like Datastream can replace fivetran
I am wondering, if we implement the indexing and partitioning, whether we should reconsider how we split between postgres and bigquery.
we can discuss more next week
Tony does not want ASA to change the status. so the retriggering will be on us
Have Daniel track it
Here is the audit of the campaign webhooks that Dwight did: https://docs.google.com/spreadsheets/d/1EsvJD9oOyuELNuHR1hebNECVtRKrvdsHaqbE9sypXW0/edit?gid=1555039326#gid=1555039326
We added a column for the monday status of the gcp integrations which tells us if the automation was deleted or never created
Before I go ask her can you double check to make sure it hasn’t been corrected?
What is the status of the approval for sending decline reasons to Simmons for Bard Powerport?
or did we decide that Simmons does not get the decline reason? I remember after the meeting with Simmons that they wanted the decline reason, but you said that we would need Cam's approval to send the reasoning
I have been waiting before sending a test with other changes
I didn’t get a chance to talk to Cam about it.
I believe Tony acknowledged needing to do it for this client but didn’t want to try to do it for all of them. Ask Tony about it and when we can expect them to be completed so we can send those leads
Hey I’ll need you to run point on getting the “inventory” of what we are sending for integrations and for you to go around to see what the other departments are doing. I will talk with Ryan
Should I wait until you are available to address Ahsan's dark humor and blame of other systems? Other people are starting to repeat some of those statements
Feel free to take a crack at it and I’ll follow up when I’m back
Also if we have a plan forward let everyone know Mal, etc
Meadow is harassing Alex about needing the signed and declined via api. Are we good to send the backfill of signed & declines to Meadow for VGA (133)? Or do we need approval from Cam & Tony?
No decline reasoning just that it was declined
I thought that is what they requested? To have the reasoning? That's what we needed approval for. Just having a field that we overwrite with "Declined" is fine to go ahead with
We needed approval for Simmons since they needed the reasoning but Meadow just wants them marked and in their system via API
Then that’s fine to go ahead for Meadow
This campaign the same leads going to Simmons and Meadow for VGA?
No, Bard Powerport for Simmons and Gaming Addiction for Meadow
Sorry driving and trying to do this is not the best
Ok let’s come up with a couple standardized reasonings for a dropdown
Things like: insufficient docs, opt out, didn’t meet timeline, not sufficient exposure, etc
Whatever makes sense for the case
If we have to we can just tell them on anything currently sent will be a hard code of “Declined” like Meadow until we can establish the categories and process
Maybe throw it back to them and ask if they have any input on categories
Cam and I will have a future convo with Tony about establishing these categories long term but he said not now
can I get the Connex DB information? @deleted-U06GD22CLDC said you could provide it
Ya give me a sec and I’ll send it to you
Need anything from me? How did the convo with Ahsan go?
Ya good man got to see family I haven’t seen in years
convo went good, think I have seen a slight change.
More memory and permission to do another LR backfill to fix bugs in the data
Backfilling: just discuss with the other departments and let Joe know
or a better laptop. I have a local version of postgres running for a report I am helping David with, and it takes forever (>20 minutes) to generate from the data (some of the data is in BQ, so I copied it locally to reduce cost and allow joins)
He is hoping to get you some better stuff by next week
Available for quick meeting with Joe?
Should I change the database dump to only run at 10PM? instead of every 30 minutes? since I think the database dump is the worst offender
I see no data in Shield Legal BI / BQ / leadspedia / all_leads
Do you want to push it to postgres instead? We can have a meeting where I show you how to access postgres. It is easy with PyCharm
if the 2PM meeting is no longer needed can we cancel it?
Yes I was just sending a note about it
let the team know
Should i give integrations permission on the project with the lr data for bigquery / postgres?
Should I change the database dump to only run at 10PM? instead of every 30 minutes?
Have the guys help out Brittany or anyone else to get the data they need
Ahsan supposedly has bandwidth so make sure he is helping out
If Brittany's team had a google group (dl) like we have for integrations it would simplify the permissions in GCP
Is it a group but we can still see individual accounts?
Just don’t want the issue of not knowing who did something
I think it is more role based access, you are in the group so you get access. I do not know how it logs, I do not find many user based logs in GCP. I feel like google would do it correctly and log the user not the group
Ok so then yes give them a group
We do not have access. It is a Ryan or Mike Everhart thing
fivetran is syncing the ftlrcasetypes-list gcp function every minute. I assume to copy lr case types from lr to bq. we should turn it off and change anything that was using it to use the lr-data dataset in bq
Found that pulling from the activity log to get the questions and answers per lead does not work, since I have to match on the LawRuler field (c-###), but that can be on the lead once per tab (General Questions, Screener, ...), which is a problem since the questions are duplicated between the General Questions and Screener tabs. The only way I think to resolve this is to use the API, since it returns the question id (unique per question no matter the tab). For 1 lead it took about 2 minutes via the API but only 2 seconds via the UI API. We will either have to run a fill 10pm - 4am PST or ask them for the data. A fill with the API on 1 thread will take >2 years running 24/7, and with the UI API on 1 thread it will take >14 days
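A back-of-envelope check of those estimates; the ~600k lead count is an assumption inferred from the quoted timings, not a number from the source:
```
# Sketch: sanity-check the backfill durations from the measured per-lead timings.
leads = 600_000
api_days = leads * 120 / 86_400  # 2 min per lead via the API  -> ~833 days (>2 years)
ui_days = leads * 2 / 86_400     # 2 s per lead via the UI API -> ~14 days
print(f'API: {api_days:.0f} days, UI API: {ui_days:.0f} days')
```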
FYI: @Ahsan and I have been in a Slack huddle and Code With Me (to my PyCharm) since 1:30 PM
And for the last 2 hours my laptop has been at 100% CPU slowing down everything
Is it really that much to run this each time? And is there anything we can really do right now?
It takes approx 5 minutes to run the fl_test.py and less than 30 seconds for the rewrite
I prefer automations / logic to AI for something like that
Hillel mentioned needing an integration with Legafi
We only made the connection with Wagstaff correct?
Correct. I don't see anything in the chain from Hillel that mentions an integration with LegaFi, only email
I don’t see anything specifically, so I was going to ask Cam
Wait you do or you don’t see anything about Legafi direct?
Columns currently in use from JP's report: Case Type, Client Phone, Client Name, Client Mailing Street, Case Number, Decedent, Rejection Reason, Medical Records Ordered (Y/N), Latest Medical Request Date, Medical Records Received (Y/N), Latest Medical Received Date
https://cloud.google.com/python/docs/reference/bigframes/latest
https://cloud.google.com/docs/authentication/set-up-adc-local-dev-environment
https://cloud.google.com/sql/docs/postgres/sql-proxy#windows-64-bit
example:
cloud-sql-proxy.exe --port [port number between 1000 and 65000; standard is 3306] [GCP project name; for lr data: tort-intake-professionals]:us-west4:[GCP postgres instance name; for lr data: lr-data] --auto-iam-authn
for tort:
cloud-sql-proxy.exe --port 3307 tort-intake-professionals:us-west4:lr-data --auto-iam-authn
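Once the proxy is listening, it is just a local Postgres connection; a sketch with psycopg2 (database and user names are assumptions; with --auto-iam-authn no password is needed):
```
# Sketch: connect through the local Cloud SQL Auth Proxy started above.
import psycopg2

conn = psycopg2.connect(
    host='127.0.0.1',
    port=3307,                               # the --port passed to the proxy
    dbname='postgres',                       # assumed database name
    user='you@tortintakeprofessionals.com',  # IAM user matching your gcloud login
)
with conn.cursor() as cur:
    cur.execute('SELECT 1')
    print(cur.fetchone())
```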
With 2/3/2025 selected and a filter to ignore all signed cases >= date selected (2/3/2025)
What sheet in the financials does the Hair Salon Bladder Cancer - DL - Flatirons - Shield Legal belong on? Or should I make a new tab
without the extra matches from last night since they are still being verified
I wonder if GCP is throttling me. I am only sending at 1.3Mbps
can you up the resources on the gcp side ?
It errored after 1.6M/~7M inserts with a network error will try again
No billable signed leads found in the database for the following case_types • 138 • 139 • 148 • 151 • 189 • 193 • 196 • 198 • 208 • 269 • 289 • 294 • 295 • 349 • 368 • 391 • 459 • 461 • 462 • 463 • 464 • 465 • 466 • 467 • 468 • 469 • 470 • 471 • 472 • 473 • 474 • 475 • 511 • 588 • 597 • 630 • 631 • 642 • 643 • 1645 • 1646 • 1654 • 1666 • 1675 • 1704 • 1738 • 1754 • 1798 • 1803 • 1804 • 1805 • 1808 • 1815 • 1819 • 1820 • 1821 • 1822 • 1832 • 1842 • 1844 • 1849 • 1852 • 1869 • 1870 • 1881 • 1907 • 1916 • 1918 • 1922 • 1935
Some might be old or maybe were moved to a new campaign, so not impossible, but check with Brittany
Was that the only thing that had an issue?
I see that there were a few Case ID's mixed in with some of the intake questionnaire data. However, it was not always included. I believe that these are our matter ID's in Litify. Correct?
Can you get us a list of Lawruler ID <> Matter ID key pairs for everything that you've matched so far? This with some basic demographic information (ie. Name, Phone, Address, SSN, DOB) should be enough for us to validate.
Add the following in main.py of the tiktokevents project to fix the error:
from litespeed import register_error_page

@register_error_page(code=500)
def _500_error_page(request: Request, *args, **kwargs):
    logger.error('Error', exc_info=True)  # assumes main.py's existing logger and Request import
    return 'Internal Server Error', 500
Can you run and send me the flatirons
I just forwarded the one I got this morning
I’ll be in the office in 10min
Esteban says they are about 1/4 done with the matching but will not know for sure until Olivia gets back, since she was doing the Camp Lejeune cases
Is the only code you have for leadspedia all-leads in GCP? or do you have github repo for it?
what are the api limits for leadspedia? (requests/second, etc...)
Can’t remember but I know they don’t allow concurrency
Hey can you send me the latest benchmark that you ran. Need it for the call with DL
Please review: https://github.com/shield-legal/Leadspedia/pull/1/files
Can you make sure you are in the NEC integrations meeting? Darrel says they want a technical person in the call and it overlaps with one of the meetings with Becker
Yes anything from Friday I should be aware of?
I’m probably going to be remote until Wednesday btw. I’m writing a note to everyone
I don't think so. Darrel was just telling us how Abe chewed him out for an hour on Friday for having a different order to the projects than Abe expected
He is just being that way to everyone. I can empathize with his frustration but it’s getting to be a lot
If we aren't we need to start uploading the DL case numbers back to the matched LR lead.
I will check with Brittany on if they are doing that, what is the list we need to work off of for that?
Everything that we have a match for: anything we have a clean match for from their Medical tracker, the list of matches Brittany's team has put together, or another place we are keeping matches. Just needs to have a home within LR. Need to establish it this afternoon, and if we have to chunk the updates overnight I will need to know the amount being updated each night, and then we need a report we send to them on how many have been uploaded vs how many to go
All the Camp Lejeune cases have been added to vitals
ok that is good, how many others? Is it possible to run a LR report to see them? If not then at the very least make sure all the matches are joined up in the DB and make it so we can run reports for them on that progress and for future reporting needs.
Talking to @deleted-U04GZ79CPNG, we should be able to do the same today with all the current matches from the benchmark file, but they do not have time to look into finding the missing matches (~1k) today
It was pushed an hour
How are you scheduling the LP rewrite you did?
How long does the get posts take to run?
currently with just the 2 reports allleads & allsold_leads it takes about 40 minutes to run and gcp has the 1 hour limit
super quick 19.97 secs
want me to merge and deploy to gcp (the deploy is currently manual [run ./deploy.bat])?
Well hold on let me double check with Brian because it pulled only and exactly 247 records for each date range which seems unlikely.
Fixed it now. Didn't need the pagination based on date. Only 265 records now. Should be good to deploy now only takes 5.23 secs
this endpoint seems to be different from all the others. You can only pull by DAY, so if we want the logs since 2019 I have to iterate through every day, plus paginate if over 1K.
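Rough shape of that per-day backfill loop (a sketch; fetch_page stands in for the actual endpoint call and its parameter names are assumptions):
```
# Sketch: iterate one day at a time since 2019, paginating within a day when it has
# more than the 1K page size.
from datetime import date, timedelta

def backfill(fetch_page, start=date(2019, 1, 1), end=None, page_size=1000):
    end = end or date.today()
    day = start
    while day <= end:
        page = 1
        while True:
            rows = fetch_page(day=day.isoformat(), page=page, limit=page_size)
            yield from rows
            if len(rows) < page_size:  # short page means last page for this day
                break
            page += 1
        day += timedelta(days=1)
```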
after we backfill then just a daily update
Was planning that for the other reports already
so I will need to have a separate pagination function for backfills and an append BQ func
What do you think of adjusting the code to run based on the value passed by Cloud Scheduler? For example if we wanted to update one of the endpoints daily vs hourly we could have a scheduler send just the daily once with its corresponding value and only trigger that function. I was thinking since we can have more than one instance running we can separate them out this way. Otherwise we will need to make a cloud run and cloud scheduler per table.
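A sketch of that dispatch idea (the real example is in the branch below; function names and payload shape here are made up):
```
# Sketch: one Pub/Sub-triggered function; each Cloud Scheduler job publishes a message
# body naming the report set to run, so a single deployment serves multiple schedules.
import base64
import json

import functions_framework

def run_daily_reports():   # hypothetical placeholder
    ...

def run_hourly_reports():  # hypothetical placeholder
    ...

DISPATCH = {'daily': run_daily_reports, 'hourly': run_hourly_reports}

@functions_framework.cloud_event
def main(cloud_event):
    # Pub/Sub delivers the Scheduler message body base64-encoded, e.g. {"run": "daily"}.
    payload = json.loads(base64.b64decode(cloud_event.data['message']['data']))
    DISPATCH[payload['run']]()
```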
made an example in latest commit to the adds-Leads-Posts-Log branch
Correct me if I am wrong but we are just waiting on SGGH to make a decision with their tech group right?
If so, I was going to message them
we cannot have more than 1 instance running unless we have more than 1 leadspedia account
we are waiting on SGGH
The message from Troy means we are good to run the backfill tonight?
but I would plan on it yes
Here are the 3 wood pellet lead ids that were flagged as dups in ELG's system:
lead_id
678271
688010
682602
ELG returns an idEntity and idRepresentation on successful posts. ELG always returns recordId (which is what is currently logged in external_id)
leadid,externalid
678271,115545
688010,115145
682602,114421
Other than highly recommending you have admin (which is needed for most programmers / software engineers) it looks like a good choice for the team. I think there is even a couple of topics I might learn
Seemed like a comprehensive program and I have already started myself on my personal account.
Is that the plan? Personal accounts since its free? Or an org account to monitor progress?
Well I did it on my own to just see what it was like
I am still in the 7 day free trial
but I was thinking we have it as an org account to track progress
it's a monthly subscription so we will want to cut it off at a certain point if people aren't making progress
It says it should be completed in 6 months with 10 hours a week. I imagine we should give them some time during the day to work on it, if there is no fire to deal with. No more than 5 hours a week of work time?
I think we still set a goal but not a monthly like I was thinking
I was thinking originally of giving them like 8-9 months to complete, which was at 60/user/mo ($3240 for 9 mo) if we just used the company card for everyone's account.
so this is better, and we technically get a year to use the platform for this course and others that might help.
Good with either. Maybe take a poll of the guys
https://www.debian.org/ https://rufus.ie/en/ https://www.ventoy.net/en/index.html
Had some issues this morning. I’m on my way in now
From Cam: we are in the hot seat with Meadow on a number of matters. Please report here in detail exactly what status and time for completion of Meadow API integration with campaigns. Please do so before you reply all to the email I had Andy send a few min ago.
We are awaiting decisions from meadow right?
Do you have that timeline of all the issues?
We are waiting for fields for the following:
• re-send of VGA with new fields (to clean up their side of the data)
• Paraquat - DL - ML - Shield Legal
• Firefighting Foam - Douglas London - ML - Shield Legal
• MD Juvenile Hall Abuse - BG - Meadow Law - Shield Legal
• Firefighting Foam - ELG - ML - Shield Legal
• Firefighting Foam 2 - VAM Law - ELG - Meadow Law - Shield Legal
• Firefighting Foam - VAM Law - ELG/Meadow Law - Shield Legal
• Paraquat - Wagstaff - ML - Shield Legal
• Arizona Juvenile Detention Center - ML - Intake (+ possible remap of secondary)
Do you know what I might be doing wrong to get this forbidden error with connex?
I haven’t touched any permissions
But you know what I think they were removing accounts off of Connex to start saving money I would ask Ward to make sure they didn’t turn off our access
I have been talking to them. Do you have example code of how to use the API?
Can you either set up the Google meet so I can listen to our stand up or can you just provide me the notes of what was discussed?
*Thread Reply:* 1) For what reason?, 2) Yes, 3) They still haven’t approved our licenses but I will follow up and ask to add him?
Hey I could have sworn you put together a complete timeline of all the issues with SA. However, I am not finding it. Do you have it on hand by chance?
All I have is the timeline of ML VGA up to about a month ago: https://docs.google.com/document/d/1TjVpeFJC01uAVAkjEW-pZnyF8eqXRgGtRLpcsu8DulM/edit?usp=drivesdk
This is great. Can you work on updating the rest of the timeline? Was in a meeting with Cam and he is saying that it isn't going well and that SA is trying to throw us under the bus.
Will do after lunch
Joe is not required at all for the pycharm process
just when finalizing to access networks
You can hit escape and it's not needed
If he installs it then he has to update it. There is an update every couple of months
Ok then let Joe know that and help the guys set it up please
Have a new SA campaign for the Gomez Law Noodle Cup Burn campaign
Can you get with Rahul to get this setup ASAP?
This is for the weekly 1-1. Let me know if you can access and what you think of questions
The amount of questions makes me feel like it would be better to fill out monthly or quarterly. I do not like weekly goals; maybe tasks you hope to complete instead? Could be monthly goals. (I feel like goals should be more long term than weekly.)
On the first page, it should be more specific on date, such as selecting the Monday for the week you are submitting. I do not like the things-you-are-grateful-for question. You collect both name and email address; could just do 1 or the other.
ps. I might be biased since I do not like goals in general
Ok noted, thanks for the feedback
If we get to choose how about shieldlegal.dev?
I was thinking shieldlegaltools.com
but I am good with it being .dev
I was thinking .dev since at least to start it will be internal tools only
Should I start the website in a new GCP project? or the integrations project?
Me and James got tickets to the google next conference through Ryan so we will not be in office Wednesday through Friday next week
We assumed that meadow would want integrations but we need their UDFs per campaign, which we might be able to pull but still need to know when the UDFs are done being built
Ok just reply in the thread asking for that, it’s fine
Hey that Chicago number was Kasia
I will store it. It was not in her signature and she did not leave a message
Can you call her
Ya she didn’t leave a message for me either
Call made. She is going to bug her IT people to get us access to their Zapier. Apparently the other marketing firms they have worked with have set up the Zapier for them. She has never set up Zapier and does not know how.
Probably best we just set it up so we aren’t waiting on anyone else
I think Dwight mentioned that one of the Nevada Juv Hall campaigns had a naming convention that didn't specify the co-counsel/firm
Yesterday he mentioned the Nevada Juv Hall didn't have a designation of Law Firm like Nevada Juv Hall DL - Flatirons
Oh, he was suggesting changing Nevada Juv Hall in Leadspedia to Nevada YRTC so that it matches Law Ruler
Just want to make sure this naming convention difference isn't causing any issues or won't cause issues moving forward if we have multiple Nevada campaigns.
Found why the leads did not send. We broke the code when we added the send to Crump part March 12th
ELG code and mapping fixed in dev. Ready to send those missing leads
In the sheet from Crump the first 27 leads are Wood Pellets ELG and have successful logs from ELG.
The rest (278) are from Drax Global Air Pollution SIL - Ben Crump - BCL - BLX which we do not have an integration on and are not related to ELG
Not sure how you wanted this file but I wanted to get rid of power query and only keep the data so I turned it into a google sheet: https://docs.google.com/spreadsheets/d/1nGztDJN3q9qx4ES5HceDe9hM4SOTlSxe/edit?gid=1849314769#gid=1849314769
ELG wood pellet data + integration logs
Meeting with LR in Tony's office at 2pm
meeting salesforce people in #CR1
BG integrations using leaddocket:
• MD Juv Hall Abuse - Bailey Glasser - Levinson - Shield Legal
• MD Juv Hall Abuse - Bailey Glasser - Van - Shield Legal
I sent you my list for Joe. Any thoughts or comments on the software list?
Alex is asking if we can meet meadow about Michigan
I will be there. They asked for a test lead to sggh but we do not have the connection yet.
I have Dwight working on VGA to try to get that finished up
DL doesn’t have any non-flatiron Maryland Juv Hall correct?
Ok ya that’s what I was looking at in Monday in the open/close tracker
Once Ward gives Josh LR access can we have Gannon or Bree train him on LR usage (just basic agent usage that we can build on)? Maybe half a day of their training?
https://docs.docker.com/engine/install/debian/#install-using-the-repository
Should the shared calendar not be named Integrations Schedule? I only added integrations, who else should see that calendar?
https://blog.det.life/why-are-there-so-many-databases-87d334c5dce6
You invited Alex's BG email instead of her tort email to the SL Data Sync meetings (Also her, Brittany, Alan & Ward are the only TIP employees in that meeting)
https://documenter.getpostman.com/view/5921778/UVeJM5kQ#58b4c01b-291d-49a4-b04c-d71e90e1ba07
I see 2 tasks in the DL Jira that you should look at: https://dicellolevitt.atlassian.net/jira/software/c/projects/SHL/boards/303
Alex does not have an account in Jira but Alan does
Her email on slack can only be updated by a slack admin, so Mike?
If it would help him understand what is going on or to help run queries to check if things sent from our side then sure
Set-ExecutionPolicy -ExecutionPolicy Unrestricted -Scope CurrentUser
gcloud auth application-default login
did we actually get the updated api limits for LR
No, limits are still 1 call per second per endpoint
And that is why I built the tool for Anthony's team to use
I do think most of those sheets should be replaced with the DB and website
That might be possible. Not straight from LR to Five9 but we can probably make it happen. Just have not been asked for it...
maybe this will also work. https://app.diagrams.net/#G1BRoyIW-CU9g-Mx-kiLd7SkcmthnbLevc#%7B%22pageId%22%3A%22R2lEEEUBdFMjLlhIrx00%22%7D
*Thread Reply:* leadcomplete
What sort of issues have you had with the machine?
So hdmi and it not charging? Or freezing?
freezing and saying the dock isn't powerful enough
Freezing would be the only issue I would be worried about.
has the freezing happened outside of the hdmi (outside of 5 minutes before or after plugging or unplugging the hdmi)? such as random freezing in a conference room?
I may have a fix for the freezing but it requires going into the bios (a reboot)
Was Paul and Michael given approval for more development? Last I was aware we asked Paul to finish the documentation, not build more features
I asked what other things they were looking to develop but that wasn't a green light yet. Why what did you notice?
Maybe the message was not relayed to Michael
Ok I told them to hold off and that I'm short staffed, so let's talk about it all later, but in the interim they get access to at least use the report builder tool
They can build if they want, accepting that file, without needing to actually have access to our code that generates it
Therefore can you add them to the portal so they can use it
Brittany has that permission and should have already given them access and showed them it. She was supposed to do that a couple of weeks ago
Who else is out? The calendar only shows me...
Me I decided to go to California last minute. Flying back tonight just didn’t put on calendar
Hey I don't have access to the web portal project
Ok thanks I asked dwight and he said the project was LawRuler UI so I didn't see that
no, lawruler-ui is the github repo that holds the db dump and bigquery scheduler cloud functions that are in the tip gcp project
the github repo for the website is internal-tools-site
ok cool thank you for the clarification
Has your laptop crashed or frozen since I changed that setting?
Also, do you have any update on VGA?
The intake is complete, we need to work on the secondary next, but Becker says that SGGH campaigns are higher priority
The error @deleted-U08GMHAPRK2 gave me has to do with a sheets add-on to auto dump email attachments to drive. https://workspace.google.com/u/0/marketplace/app/save_emails_and_attachments/513239564707?flow_type=2
With the data policies @James Turner is writing up, we should check these sorts of add-ons for data leaks
what is the purpose of this repo? https://github.com/shield-legal/Five9
I have to run home to fix my insulin. I will grab lunch as well. I plan to be back in about 2 hours
I was mistaken, the code from the website uses the following subquery for e-sign date: (SELECT MAX(date) FROM lead_history_status WHERE leadid=l.id AND tostatus = 'Signed e-Sign' GROUP BY leadid) AS esigndate,
List of tasks for me to work on while you're out:
• Add Documents table to DB dump
• Add Storage bucket for Documents and Contracts for DB dump
• Get 8 SGGH campaigns approved (@Dwight Thomas)
• Start and finish 9 SGGH campaigns (@Dwight Thomas & @Zekarias Haile)
• Do 2 ELG campaigns
• After meeting with Cooper Masterman on Monday, work on their integration
• Helping @Tyson Green & @Chris Krecicki get up to speed
• Help @Daniel Schussler with DL integrations
• Monitoring and fixing any issues with automated functions: DB dump, Five9 recordings, Five9 reports, integrations
• Helping with anything that comes up
• Code reviews for @Tyson Green, @Chris Krecicki & Integrations
I do not think using GDrive will work for Cooper Masterman due to GDrive permissions (the API is extremely complicated & the permissions are worse) since ADC (Application Default Credentials) do not work for GDrive. I cannot get the permissions set properly after half a day. I also do not think that giving them access to the GCS buckets for their campaigns would be wise since there is no hand off at that point. What are your thoughts?
Can you determine which leads sent vs didn’t send emails and see if there is a common date that it started/stopped?
Here is the list that was sent via email and when
Ok, can’t really see it all on phone. Don’t need to do it right away, but do an analysis of which leads didn’t send and what their e-sign dates are, to determine if there was any commonality in when they stopped. Since our system is based on the emails, they would never have received them even with an integration.
We sent all leads via integration last Friday
Ok so even the ones that did have an email they didn’t have in Litify
So would you agree it’s a bit on DL’s team that they didn’t enter them in manually like they had with other campaigns?
Baby Food - Dicello - Shield Legal
ASA NY Sex Abuse - Dicello (DLG/TC) - Shield Legal
L Brand Tampon - Crump - Dicello - Shield Legal
Paraquat 5 - (DLG/TC) - Shield Legal
Camp Lejeune - Dicello - Crump (DLG/BCL/TC) - Shield Legal
Hospital Portal Privacy - Dicello - Shield Legal
Camp Lejeune - DLG - GWBP - TC - Radio
Camp Lejeune - Dicello/Ankin - Shield Legal
Social Media Teen Harm - Dicello - Shield Legal
Camp Lejeune - DLG - GWBP - TC - TV
Paraquat 7 - DL - Shield Legal
Camp Lejeune - DLG - TC - Shield Legal
All the email automations are only set to trigger on language of english
should be a column in the db we can search by
Ok how many of those cross reference to our list of 171
The NEC (708489) and a hair relaxer - dl - flatirons (587246)
But Anthony was saying this is a recent change (around April) where they told the agents to only use English. Only those 2 of our 171 were changed from English
Well that’s a good catch nonetheless
Working sheet: tests are denoted in R and backfill in V (~## is not done, #/## is date completed) https://docs.google.com/spreadsheets/d/155xSWtVyYOqqzKjUhM2C2hN4kAQZg-mH/edit?gid=203360950#gid=203360950 DL sharepoint: https://themedialaboratory.slack.com/archives/C0976UMQYHW/p1753201301949249
1717 - 809062
1675 - 806462
1791 - 748367
2041 - 807826
1804 - 808894
1995 - 733196
Do you remember anything about Danny at ACTS and countersigning retainers?
Integration error log view SQL
create view integrations.integration_errors
(id, contact_id, lead_id, case_type_id, storage_url, triggered_at, destination, external_id, processed_at,
response, document_response, status, destination_disabled)
as
SELECT id,
contact_id,
lead_id,
case_type_id,
storage_url,
triggered_at,
destination,
external_id,
processed_at,
response,
document_response,
status,
destination_disabled
FROM integrations.integration_log il
WHERE (external_id IS NULL OR response !~~ '%[20%'::text) AND NOT destination_disabled AND NOT (EXISTS ( SELECT 1
FROM integrations.integration_log il2
WHERE il2.lead_id = il.lead_id AND il2.destination = il.destination AND il2.triggered_at > il.triggered_at AND il2.external_id IS NOT NULL AND il2.response ~~ '%[20%'::text))
ORDER BY triggered_at DESC;
The piece that does the date checking is the self-join, plus the il2.triggered_at > il.triggered_at condition
Did you include @James Turner in your list to Joe?