Hey there~ I came up with something I think you and your team might find useful
It's a basic little set of Python tools that I call Cartographer
2 main tools: one is Mr.Context, which takes 2 different Excel columns, uses a basic machine learning model to understand the context of both, finds the best match, and spits that out as an Excel file
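Roughly, the idea is something like this, with fuzzy string matching standing in for whatever model it actually ends up using (the file and column names here are made up):
```
import pandas as pd
from fuzzywuzzy import process  # fuzzy matching as a stand-in for the real model

# Hypothetical input files/columns
left = pd.read_excel("sheet_a.xlsx")["name"]
choices = pd.read_excel("sheet_b.xlsx")["name"].tolist()

# Best match + similarity score for each value in the left column
matches = left.apply(lambda value: process.extractOne(value, choices))
pd.DataFrame({
    "original": left,
    "best_match": matches.str[0],
    "score": matches.str[1],
}).to_excel("matched_output.xlsx", index=False)
```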
Interesting 🙌:skintone2: I haven't heard of that but I am interested in looking more into its library 🙌:skintone2:
Oh yesss I've done a fuzzywuzzy match before (love that name) lol
So I am assuming it may learn from certain data responses and give best match recommendations ?
Exactly yeah- if you want we can set up a 1:1 tomorrow and I'll show you how it works
Yeah I am free tomorrow for sure 🙂 I appreciate you giving me the friendly heads up :catroombaexceptionally_fast:
Hey good morning man- This is the first run so you might run into hiccups here and there but this should work for ya
I'm gonna send it from my other acc because it won't let me send an application here
Good morning ! Oh okay no worries 🙂 I got it ! 🙌:skintone2: Thank you James
On my way to the new office now, may be a couple mins late
No worries James let me know when you have arrived so I can meet up with you
Hey man, gotta sync with you when you get the chance
I was supposed to be out of this meeting like 40 minutes ago so hopefully soon lol
Sure thing' I'll try and do it asap, I'll head over to your desk
Oh I see, if you are at the office then once you are off that meeting that works for me. I wanted to grab a bite but I'd prefer to do so afterwards if that makes sense
Hey James. The instructions Joe gave me were to first text him. Once he gives you the green light that he is ready to assist, he will then ask you to reboot your computer and log back in. Once you are logged in, just wait for him to take control of your computer, and from there he will be able to assist.
My bad, I didn't realize you were in this too
No worries ! I was looking for you, not sure if you are in the conference room lol
After this, let's try and sync with Nick so we can get a better idea of why ryan is running you through all this and what he's talking about. My understanding is he would be having another team do the finance stuff, not you
From my understanding I am helping with the S1s and declines, etc., and he wants me to understand some of the "background" process to possibly create a python script or sql query to automate that from his end. As for specific finance #s, I think he isn't teaching me that stuff; it's just a coincidence that those show up in the lessons he is trying to show me
Please for the love of dog, don't ever name your commits "dasdh"
Are you following? 100% okay if you aren't
Some but not all since I am getting pulled to do another task (to help a fellow co-worker with a script I built for the team)
But I can still stay on since I am just about finished with it
I kinda want him to figure it out first then say it or worst case I will review the recording
Plan to keep our meeting with Ryan for now, but know that I am in a meeting with him now and it's set to go long
Cool stuff- Meeting with people from Google
Question James, you said pulling any type of query from BigQuery would be like $1 per pull?
So not quite, the larger the query, the more processing power it needs and that's what costs, but that can add up very quickly. What are you thinking of doing?
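One way to sanity-check the cost before running anything big is a dry run; it reports the bytes a query would scan without actually running or billing anything (rough sketch, placeholder table name):
```
from google.cloud import bigquery

client = bigquery.Client()
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
# Dry-run jobs don't execute or bill; they just report the bytes that would be processed
job = client.query("SELECT * FROM `some_dataset.some_table`", job_config=job_config)
print(f"This query would scan about {job.total_bytes_processed / 1e9:.2f} GB")
```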
Oh that makes sense. Nothing quite yet; just want to know what my limitations are before creating stuff, and also I recall Dustin and Ahsan talking about the $ side of it so just wanna make sure I have the correct details
Best path for now would be to build the queries in postgresql and talk to Dustin and me before sending anything through. I recommend mocking up schema blueprints of what you want things to do/look like
Copy that. I recall installing pgAdmin, but did I also have to install Postgres too?
Good question- pgAdmin is kind of like VS Code for PostgreSQL databases
Yeah I want to see how I can create my own "testing environment" so that I can create things and use the SQL skills I am learning and continue to learn as well
During free time after work I have MySQL installed so I am running off that to practice with, but Postgres and MySQL don't seem too different for queries
They are very very similar, I think pretty much the same outside of a few minor verbiage differences
Hey are you busy today? I got some cool info and wanted to discuss the reason for that project I posted about
Hey James ! I wouldn't say I am too busy to check it out; just updating the create new leads script / LR API thing I was mentioning before
Edward's LR Endpoints / API Scripts: Hey James. I am sending you a link where I am posting the scripts I built that use the Law Ruler APIs, both ad hoc ones and ones that are scheduled weekly. I am still adding more to it (as I go through my code), but essentially this should cover all the code I've built so far that uses the LR API for different purposes
Heyo- A notice will be going out to the broader company soon, but pause all reports from Law Ruler and any code that touches the Law Ruler API
Hey @James Turner thanks for the friendly heads up. I am going through all my coding as we speak to ensure nothing gets run
Hey @deleted-U07UDN65C3U sorry Ryan is stuck with a meeting with me but wanted you to hop onto this meeting if you can: meet.google.com/cnf-umga-bhw
Yaknow how we were SUPER careful about how we named everything yesterday on the schema?
Heyo, can you please check if the pk_id columns are integers on Ryan's PostgreSQL tables?
I'm planning on sending this to ryan but I want to be sure first..
Hey Ryan, I’ve made good progress on the status rates table. I’m also using this as a teaching opportunity for Edward, who is shadowing me, and he seems to be picking it up well! I did have a couple of questions regarding the specs for the table:
pk_id - I noticed that it's currently specified as an INTERVAL. Since the pk_id in the PostgreSQL table is an integer, did you want to source it from that table, or should we consider another approach?
external_status_name_desc - should we populate this with a default value for now so it can be properly populated using Brian's tool?
Thanks for your guidance!
@James Turner Here is an example of what it shows me in pgAdmin
I believe the PK id will be an integer, and it looks like the LR ID # to me
Also, about the number I gave you and Nick W (about 14,500 fields): Brittany and her team are going to narrow it down some more to see if there are clients whose fields don't need to be updated (since they already have the data there), so essentially she is going to look through the data before I send out that email. Just a friendly heads up.
If you wanna come to my office for the sync, you're all good to c'mon over
I'm getting all the final bits and bobs put together for the status table Ryan requested last week ahead of our meeting with him today, but wanted to give you the opportunity to look it over~
Gotcha, yeah I hope you were able to fix that issue / bug from last time; I can take a look at it if you like. Do you want me to pop by ?
I started early this morning and didn't drive in yet because I was locked in on projects lol
Ohhh that makes sense haha. I thought you might have been here already since I can see the lights on in your office from here
Now rather than create 2 separate automation scripts, I created one that should do both.
I know you are new to SQL but if you wanna look over the tables and let me know if that query makes sense I would appreciate it lol
As far as I am seeing I can't think of anything I would change about it
I know we discussed leaving out things not on that excel sheet, and the pros and cons, but as ryan wrote it on the requirements (excel line 19)- that would insert them anyway?
Sorry I am not following what you mean by that. Leaving out things not included within the excel sheet makes sense but I am not understanding line 19 very well 😅
You're all good, the way he wrote it was confusing
he said to make a piece of code that inserts data from the ID column of the lr_statusus table to the new status table every 5 minutes
insert being something that puts new rows in, so we don't actually want to insert every time
because you will get a bunch of duplicates
My understanding of what he was asking for was a way to see if it's already in there, and insert new ones
Yes, it should check if they are already there or not; create them if not and then update periodically I suppose
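A minimal sketch of that check-before-insert idea, assuming the target table has a unique key on the ID column (table and column names here are placeholders, not the real schema):
```
import psycopg2

# ON CONFLICT DO NOTHING skips IDs that already exist, so running this every 5 minutes
# never creates duplicates (requires a unique constraint / primary key on status_table.id).
UPSERT_SQL = """
    INSERT INTO status_table (id)
    SELECT s.id
    FROM lr_statuses AS s
    ON CONFLICT (id) DO NOTHING;
"""

with psycopg2.connect("dbname=tipmaster user=postgres") as conn:  # connection details assumed
    with conn.cursor() as cur:
        cur.execute(UPSERT_SQL)
```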
Hey James. Sorry to be messaging you so late. I wanted to let you know that my throat is starting to feel sore after work and there is a possibility I may wake up tomorrow not feeling so well. In the event that I don't feel well tomorrow, I don't want to go to the office and risk getting others sick, so I may try to WFH tomorrow and/or simply just take a sick day. Sorry 😅
Hey there' You are all good. Thank you for letting me know ahead of time. Get some rest!
I spoke to Nick and let him know you told me yesterday that you aren't feeling well. Are you still okay for our sync today?
Nick is concerned about the flatirons stuff
Hey James @James Turner I am feeling better. I am still coding that stuff for flatirons since I know the higher ups want that asap for their meeting tomorrow
Yeah we can sync today if needed. I just got out of a meeting with Brittany to get her team's feedback for the flatirons benchmark to be prepared for a meeting I have with Tony
Right now it is simply just "theory" based but it's like the saying goes, if we build it they will show lol
update FINLOG // table name
Parentcasetype = "oldname"
typeofcase = "Michigan Pollution INJURY - Ben Crump Law - ELG - Shield Legal"
retainercode = _
where leadid IN (1,2,3,4)
What is the parent case type? Get the list of lead IDs to change to the new case type. What should the new retainer code be?
Hey James ! Just wanted to give you a friendly heads up I was successful in updating those 371 leads in PGAdmin.
here is my SQL code just in case you need it or want to know how I did it:
-- Update the fields in the public.financial_log table
UPDATE public.financial_log
SET
parent_case_type = 'Michigan Pollution',
type_of_case = 'Michigan Pollution INJURY - Ben Crump Law - ELG - Shield Legal',
case_type_retainer_code = 'Michigan Pollution INJURY'
WHERE
type_of_case = 'Michigan Pollution - Ben Crump Law - ELG - Shield Legal' -- Match the old case type
AND lr_lead_id IN (658388, 663236, 651483, 655279, 655535, 656447, 655219, 653802, 656108, 656083, 655182,
655203, 655725, 653344, 655099, 655573, 653980, 654253, 655162, 653837, 654962, 653819, 654486, 654478, 654020,
653870, 653862, 653782, 653593, 653709, 653499, 651512, 651608, 651468, 653798, 653967, 653716, 653814, 655627,
653368, 655094, 653954, 653794, 651435, 653749, 653796, 654149, 655185, 655274, 654482, 654468, 651634, 655504,
661719, 655175, 651452, 654813, 653610, 653728, 651450, 654609, 652618, 653772, 656193, 655104, 654465, 654430,
654289, 654860, 653851, 654062, 653975, 654134, 654029, 653712, 651485, 652124, 651792, 653961, 651473, 653955,
653722, 651480, 655957, 653389, 652732, 651448, 651525, 663261, 652927, 651493, 653758, 658363, 660639, 653255,
663076, 661174, 658333, 656652, 654444, 657393, 664639, 664844, 663079, 664597, 664519, 664291, 664439, 664397,
664325, 664258, 664202, 663797, 657609, 663390, 663072, 662924, 662473, 661656, 661654, 652052, 661254, 661415,
661608, 661566, 661580, 661544, 661321, 661247, 660988, 660164, 660906, 656055, 655616, 660181, 660182, 657851,
658406, 658730, 660062, 658512, 659756, 658003, 657100, 656972, 657543, 659319, 658504, 659366, 659255, 659150,
658977, 658203, 657574, 657008, 653896, 657761, 657532, 657818, 657790, 657607, 653592, 657544, 657278, 655125,
657425, 657134, 657149, 656995, 656811, 656794, 656619, 653793, 663557, 658667, 663655, 657793, 665005, 661328,
660146, 660184, 660047, 657759, 661993, 655180, 663873, 663632, 663535, 661317, 657913, 659822, 657740, 653865,
664185, 663093, 659962, 664241, 663036, 663369, 663282, 662656, 661075, 659950, 659002, 657936, 657881, 658137,
654498, 656805, 656292, 655859, 661034, 655336, 659301, 659188, 659291, 656993, 656939, 658716, 654396, 657489,
657201, 661927, 662012, 661037, 656118, 657137, 656416, 662967, 659535, 657007, 658596, 662206, 660983, 660668,
658776, 659356, 658278, 657126, 655881, 655660, 664477, 661303, 661241, 664622, 663263, 664871, 664846, 655677,
664370, 664151, 664119, 663726, 662806, 661868, 661262, 663306, 662906, 662864, 662600, 662619, 662428, 662560,
662544, 662073, 661828, 661983, 661712, 661732, 661555, 661463, 660599, 661350, 661129, 661229, 658753, 651964,
660032, 657225, 659537, 659868, 657821, 658997, 658715, 658317, 658938, 657372, 658466, 656770, 658275, 658232,
655209, 654237, 657780, 657413, 653808, 655626, 660112, 655984, 657716, 660190, 664351, 664295, 664318, 664261,
653859, 663420, 662826, 661301, 661202, 659113, 654366, 659380, 657594, 656332, 657724, 654150, 656969, 663875,
663728, 663439, 657623, 663219, 651422, 656219, 655149, 658528, 651598, 654144, 655299, 655807, 655151, 653998,
653930, 653882, 653707, 651496, 653912, 662147, 659435, 654393, 654200, 653626, 651449, 654606, 655476, 654383,
653731, 654423, 653561, 654148, 653850, 655192, 653908, 656216, 655184, 654982, 653231, 654816, 654583, 653598,
653810, 653624, 652121, 652102, 654879, 653987, 653600, 653785, 651467, 652675); -- Match specific Law Ruler IDs
Also we updated those 371 leads in law ruler as well through the import wizard so Sobo should be good from his end as well
So my part and Sobo's part should be complete if that's all there is needed for it
Hey James. I just wanted to confirm with you that we are still good for our 1pm
Yessirree- I'm on my way over now. May have to call it a lil early bc I have a cam presentation right after
Just food for thought since I used it in the past and it became a habit, so that I don't need to type in \\ but rather just \ lol
I appreciate you filling me in on that! I've been doing the double slashes forever and making sure my path names have no spaces. This is much easier!
Hey there~ is it okay if we cancel this 1:1 and re-sync later in the week? Working on a piece of code here and allllmooostt there (maybe)
Hey James. That works with me since I am in the same boat as you (working on flatirons coding stuff) haha
My guess is the reason they told Ryan to make a separate project for each data set is so you by default only include the information you want, rather than import everything and have to manually hide and edit each item
My understanding:
project - all of the connections you want for a project
View - a read only view of a table that you are connected to
measure - a way to get new results from existing data points. For example, a measure might be a count of how many leads you have per case.
explore - all of your tables from the view + all of the measures you created.
visualization - the actual graph/bar chart/table visualization that you want to put into a dashboard.
DBT is taking the role of creating measures from lookml
Yes I believe that is what he is referring to 👏:skintone2:🙌:skintone2:
If we have to postpone I understand since I am working on finishing up the flatirons script
Hey there, I apologize - you're good to keep going on with the flatirons stuff. Let me know if you run into any trouble with it
No worries, I appreciate it ! It has very tricky logic to it but I am getting closer and closer to solving it as I debug / test things through print statements
I have a better idea of what DBT is now that I looked it up. I think just having a reference key we can look at at any point (just in case we forget or have a new person) that tells us the name of the folder, what it does, and what goes into it
Then we have another reference key for each bucket that is important in BigQuery that summarizes what is important in it, and having those two reference keys should do it
From what I'm getting out of this.. it's kind of like jupyter but for SQL
But instead of different code blocks, it's different files, in different sections
I see, yeah cron jobs and having each folder with its own purpose such as a folder for raw, a folder for analytics, etc
Yeah and having different sources and combining them into one
I don't really see why this is needed vs just properly organizing GitHub and using code but ehhh
I see what you are saying. I guess its the cron job part maybe?
I think next time I need to break it down like how we did in person with him where we wrote down what the names are, what they mean and what to do with it.
Telling someone "Make a table" doesn't mean anything when you don't tell them what you want the table to be.
Do that for each part and then maybe go over the "possible challenges" and that should do the trick (at least for my end for understanding)
Yeah- I am attempting to slowly lead Ryan to doing that, and I think the finance log stuff is a good example of him starting to understand that we need that information to do our jobs
Yes I agree. I am in another meeting with him as we speak and this one he is going one by one which is a lot more helpful, so if we can push our meetings to be more hands on, slower, and writing each step & its purpose, etc., I think long term we can get things done faster and more accurately
For the finance log stuff (https://docs.google.com/spreadsheets/d/1p9QZpMZX4ww4AXFAnu1nJRb8XCozzWEGtjZK1UWPmQA/edit?usp=sharing) I am in the process of attempting to get the LP contract ID on each campaign right now, but running into some issues because of how the contract ID is pulled from Lawruler initially
Hey there, in the postgresql tables, are there any leadspedia tables that show the contract ids?
I looked through every single table in both the tip and shield bigquery and found contracts, but nothing I can join them on.
Good Morning @James Turner Hmm I can send you a screenshot of all the tables and example values (of a row) to see if that answers your question?
I think that might actually be what I need..
I didn't realize we had data on postgresql that still wasn't viewed over
Do you see a way to share it with me, or to create a view to Bigquery?
I know it is tipmaster ---> schemas ---> public ----> Tables ----> financiallog
Hhm- I'm not either, but it's important that I get access to that postgresql data if we wanna make this finance table make any sense.
I apologize for taking you away from what you were doing but can you check to see if it is mirrored on Bigquery. I looked through every table but I /hope/ I just missed it
If not we may need to ask Ryan where he is getting that data from
Well shit- This might be it.. I wonder if I overlooked it because the number of rows doesn't make sense
we have 645,000 leads in LR, and this only has 110k rows
Hmm not sure about the discrepancy but it is the closest table I can find for that field name
That table that you originally showed me in Postgres
Unless these are just "billable" and others are not billable ones
Hey man, welcome back! How was your day off?
Thank you James. It was nice to take a Monday off 🙌:skintone2:
Got my script to work and working on creating a nice little dashboard to view
Abe's (our client) favorite color is orange so for some extra points I went with that theme lol
I was thinking "Wow this is really orange"
How did you find out that orange is his favorite color?
Brittany told me from a conversation she had with one of his employees who makes him plenty of tables
I couldn't find my headset at home that I bring to work, is it cool we hop onto it / listen from your office ? If not I can ask sobo to borrow his
Sorry for late reply, I was getting VSS opinion on our sub category sheet
I think since we are making the tool for everyone it would only be fair to get their opinion too
Can it be in person for 1:30 pm since I didn't bring my head set today
Let's move the sync until tomorrow. Ryan's going over and it's throwing my whole schedule out
Hey @James Turner Getting out of my flatirons meeting. That works with me. Gives me more time to work on my code
Hey there, is there any way we can incorporate the subcat stuff first? Already seeing it where it would be helpful on a few other projects
Good morning James ! Sure I think that should be a lot easier to do since that is just a vlookup / xlookup from my end and the Tort Type that one will take a bit (not hard) but will need to ensure that one is correct
Once I am off my meeting with Ryan I can start that process (sub/cat) if needed
I appreciate! Just to clarify what I mean is the intake/bcl/secondary bools and the secondary_category
I'm starting the process of bringing the "All Case Types in LR" page into BQ and there are about 44 errors that are preventing things from being brought in as a proper boolean.
For example:
in column L, which is a true false for if it's a Sec case, instead of saying "Yes" it says SEC.
Am I okay to change those kind of things and clean it up?
Thank you ! I am writing up a SQL query for it, so once I have that good to go I shall send it to him to review and then run it, since it will be updating contract revenue (costs) so there's no room for mistakes lol. But if he gives me his okay on it before running, it should be good to go
I'll keep you updated on my progress on this side- I wanted to make sure it was okay with you before I edit your thing haha
What he mentioned made sense to me (for the most part) but I need to confirm this from his end, and then I think I will be good to go.
Yes that works with me ! Feel free to make a copy of it and edit how you think is fit !
Sorry Ryan and Flatirons has my attention this morning 😅
Oop wrong one
How are you doing? Is it alright if we push to 1:30? hitting a flow
Looking at this here, and it looks like the Docs cases are showing up as intake, do you know why that is?
Could be possible it has both but without the case type name I can't 100% say
I may have read it wrong, because I'm searching for dupes, but the one populating as intake true is • CA Juvenile Hall Abuse DOCS - ACTS - ACTS - ACTS
Looks like it might actually just be that one
I went with DOCS since Alan told me that they are separating things based on case type name and I was informed DOCS meant docs
That makes sense, I think it was actually just that one case type that was incorrect
I can change on my end, sorry for confusion lol
Update on the screenshots above. I had the chance to review the data and it looks like a lot of them match the S1s between Ryan's table and the BigQuery table. The reason one is 1048 while the other is 76 is because I think his table doesn't include specific statuses that are not "final" while the bigquery table has all statuses (final and disqualified, etc)
I didn't realize this until now, however that would explain the discrepancy between the #s
Good catch! That is what I was trying to figure out earlier when I mentioned we didn't have status discrepancy on my table
*Thread Reply:* Yeah I forgot about that but my main focus was to get all those S1s back lol
I see you trimmed " " but are you including blanks or "spaces" in those searches?
I believe it is only including ones that have values that are not blank and not spaces?
If you add OR " " to the IS NOT NULL line
I think the trim just removes the empty spaces, but removing them wouldn't make it a null value
I think this should remove ones that just have spaces to it " " and ones that are null and only keep ones with Signed e-Sign statuses
Talked to ryan and actually learned some helpful stuff
Sounds good 🙌 I look forward to it for tomorrow (at home replying back since I just saw this 😅)
Doing well! Working on a lot of ryans stuff. I don't have much to update on, because I'm still sorting out the weird in most of it
I hope there isn't too much weird stuff about it haha.
As for myself I am working on building this for Luis and Flatirons specifically:
There are a few parts that my script must do as well in order to inform Flatirons about discrepancies / data that needs to be adjusted from their end. I went ahead and wrote out what was discussed in a meeting I had with all the parties involved and got them to confirm with me thru email that this is what is needed, and if so I shall build it just like this. Last time I built the code the way it was described to me, but then one person asked me if I was factoring in something else that wasn't told to me at all. So basically I'm just having it documented, that way if it wasn't something agreed on then they can't say it was me lol
However I like to get your opinion about something for our 1:00 pm if you still have time
Sure thing! Swing on by, Nick is here as well so we can talk to him
Hey James. I am looking at those tables you mentioned in bigquery and got together with Aidan about it, and we probably still need more clarity on it. Aidan had a task where he would calculate talk time for intakes and secondary, etc., and he is transferring that task to me since Ryan wants me to do it. But since it is my first time learning it, I am not sure what I am looking for in terms of which fields in the bigquery tables, and I showed Aidan the same thing and he isn't sure either. Also, based off what Ryan is saying he wants us three to get together to solve this, and I feel like we may need David as well since he is the five9 guy. Aidan just needs to know "how to pull the reports," which he will then show me so that I can do it from my end too. However, Aidan leaves before 2:00 pm and it looks like Ryan needs us to have this solved by today for Rose for financials
Sorry I left for lunch right before you sent that~
Talk time per section like screener, closer, etc is going to be a really difficult thing. It's on my radar but right now we have no way in the data to separate what job a person does.
We might be able to go by status change but it's not reliable
Yeah I understand. Even Aidan said this will be difficult to calculate too
And I think Ryan may think it is a simple task, but I think we may need to add him into a meeting to discuss this with him, or maybe David who works with five9 all the time
I brought it up with Ryan briefly, and he told me not to worry about that part and just do the other aspects of financials for now. I didn't know he was passing it off to you guys
I talked to David briefly but we can try again
Oh okay, I appreciate you talking to Ryan about this. If five9 is covered from his end, you may let Ryan know that we are good from our end too since we did the connext and vici costs already and were just waiting on the five9 stuff
Hey there' I would like to get Nick in on our 1:1 for today if we can so we can talk over the request from Ryan and figure out what he needs + what he was getting from connex before
He's not in the office right now, but I'll let you know when he gets back in- he just stepped out.. but I'll take a look at that for now
Oh okay sounds good ! The top of it (line 1) I put the link to the spreadsheet that we use to calculate those values
We basically take a report from connext and vici and paste those values in the spreadsheet to get the revenue costs. The spreadsheet / google sheets has formulas to it where it calculates the costs based on the values we put in which comes from a report from connext / vici.
I wasn't shown how to generate those reports since I was told they will no longer be using those and will be using just five9, so I had Aidan generate those reports for me, and from there he shared his screen and showed me how he calculated those values
Gives me more time to go back to coding for Flatirons now that I finished Ryan's stuff (the S1s) last week
Hey there, I just sent 2 google sheets to you, can you please take a look and see what leads these aren't lining up on and potentially why?
Hey ! Sure I can take a look to provide my feedback :catroombaexceptionally_fast:
No problemo ! Could you send me the SQL for it as well (the one you used to calculate your 99)?
WITH rankeddata AS (
  SELECT
    *,
    ROW_NUMBER() OVER (PARTITION BY leadid ORDER BY statuschangetime DESC) AS rn
  FROM `tort-intake-professionals.Financial_Log_Dataset.pctid_x_data_with_billable_status_and_rev_rates`
  WHERE (status LIKE '%Signed e-Sign%' OR tostatus LIKE '%Signed e-Sign%')
    AND status NOT LIKE '%Secondary Interview%'
)
SELECT *
FROM rankeddata
WHERE rn = 1
  AND DATE(statuschangetime) = DATE('2025-03-17')
  AND marketingsource = 'Shield Legal'
  -- AND intakebool = TRUE
  AND testbool = FALSE
-- to do: Do not include tests
But since signed & declined doesn't have signed e-sign to it, I think it may be a timing thing such as it was signed & final but the bigquery table needs to be run again?
Where are you seeing Signed and Declined? I'm not seeing those
Interesting, yeah it looks like it is in intake under review on mine and does not show at all on his. I wonder if I change my query to include the to_status AND current status rather than OR
Also just updated the Ref table so in a bit I can re-generate
Question James - non-work related; I know you mentioned you are into hiking and outdoors. Did you ever get into any anime or into specific T.V shows ?
Lol a bit, I am watching Solo Leveling now and I am a big Cowboy Bebop fan
I see I heard of them both. Cowboy bebop is a classic & heard good things about Solo Leveling.
I finished watching Overlord & now watching full metal alchemist brotherhood since my friends told me I am not a true anime person unless I watch it lol
Haha I did enjoy FMAB quite a bit! It's been YEARS since I saw it
I haven't seen yu yu hakusho but i heard good things!
If you have a cheap projector or any projector really, I recommend using it to watch those older animes because the way it emulates those art styles is mint
Ima head out for lunch but I'll be back later
Yeah I heard good things about it (FMAB) and currently watching it so I can see why everyone loves it
Yeah I see what you mean. It's like playing an old video game. It's better with an old TV compared with new ones since it gives it a nicer touch to it
Hey there, any other findings about discrepancies?
That is the main discrepancy I found so far. If possible can we see an output of how that looks when it is resolved to then determine the next discrepancies (if there are any left)
No meeting today, trying to figure out the ryan stuff + some stuff for tony
but if you have any updates, feel free to slack me!
Understood. From my end I am working on a function that will take the flatirons medical record report and place it in google cloud storage as folders (based on month, year, and day) to keep the raw data .xlsx there, since there were multiple instances where we had to go back to them with screenshots of proof where their system showed certain values they were questioning. So far it is going well !
When you do have time I can show you how it looks once I also tweak some of the bugs
Today is tough because I got an impromptu task from tony on top of the ryan one, but tomorrow works!
No worries ! Hope all goes well with what tony is asking for including ryan's request :catroombaexceptionally_fast:
Hey there~ Thank you so much!! I really appreciate it
Oh my gosh I just saw you sent that at midnight haha
Ah that makes sense. May need to export as excel / csv to do a comparison lol
Looker presents it as a unique letter for some reason
Making things work / match is definitely a unique thing here 🤣 joking
At least that's what Looker is probably thinking
Hey James. Taking a lunch prior to our meeting as we speak :catroombaexceptionally_fast:
Cool! When we do our meeting I was wondering if you could take a look at this for me and tell me if it makes sense? https://docs.google.com/spreadsheets/d/1b9GRCOvmbiisRjG6_CmhjJ6KhdJE6aQg1TDrqYwH-HY/edit?usp=sharing
I need to go through on Lawruler and do the manual confirmation for each lead like on the manual confirmation tab.
Sure I can 🙌:skintone2: I'll give you feedback on it when I'm back :catroombaexceptionally_fast:
Hey there, I want to attempt to validate my finlog stuff. On the Shield Reconciliation tab, is that JUST shield as the marketing or everyone?
I imagine just shield but I want to make sure
Send me the link of it one more time. Sorry. But based off the screenshot it looks very appealing tbh
Those colors and design look nicer on the eyes compared to darker colors, etc (even though I like darker colors better, but I was told brighter colors are better for dashboards?) but looks nice !
I got a lil fancy with it and made a lil logo/icon to show it's the new system
Yeah it is very clever ! Tomorrow I'll look more into it, for the sake of time, but it looks nice to me !
I can show you tomorrow as well the code stuff I've been doing for flatirons since I am just about finished with it, but I'm testing it over and over and adding some improvements here and there to ensure things go as planned lol
That one day we had a meeting between all of us in person with Ryan, he had sent Joe a slack message and tagged him (Joe) and myself. I did not see Joe reply back to it, so I asked Ryan for a follow up (to see if Joe messaged Ryan, since I haven't heard anything)
So my question is, to ensure I don't "overstep my boundaries," is it possible for you and/or Ward to request it officially? I ask because in the past, from my experience, they have always told me "we need to see the request from your supervisor," which IT and Joe had told me before. So having my supervisor or someone above me ask I think may solidify the request?
I don't mind walking to IT and asking myself if needed but I think they may want something formal and from my higher-ups
Hey there, happy Monday! I completely understand the hesitancy with this, you're golden! I'll draft up an email to send over today. I just had IT put in a third monitor for me so this is good timing
Thank you, I appreciate it ! Once you do send that over to them, I will then inform Ryan about it, that way he is in the loop too since it sounds to me he wants to be in the loop as well
Can you send me a screenshot of the request from Ryan to Joe?
Also to be clear, I am not asking for a brand new laptop or for them to order me one; it would be to use a laptop that is already available that no one has claimed yet (which I was told they may have)
Then we received no reply nor any emoji reactions (from joe specifically)
Please remind me your exact title again?
was it Junior or Associate data engineer? I don't have all the paylocity stuff
Technically it should be Junior based on what google says lol :catroombaexceptionally_fast: but I am okay with just data engineer too haha
Can you send me the screenshot again but a little higher so it fully shows the date that message was sent
No worries at all, Just wanted it saved for my records
Thank you ! I received that email too :catroombaexceptionally_fast:
Hopefully that should get us a prompt follow up
Hey there, what happens when you go here: https://github.com/shield-legal/NewFin
Hey there, do you know if the current/old financial system uses the rev rates IO table?
Let me know what you get back from Ryan- I appreciate!
For Call time, is it total call time in Five9, Call time per campaign, or call time per billable lead or something else?
*Thread Reply:* This I am not too sure about. That is why I want to get together with Aidan about this to see how he calculates it
I believe for military records they lump multiple campaigns together and they can only do a military record one if the original campaign is already e-signed so we would just remove that entirely
So far I am pulling all of the call time for any lead that went billable in the month of march
SELECT
  five9.date_and_hour,
  five9.bill_time__rounded_,
  five9.lead_id,
  leadrate.io_call_time_minute_rate
FROM five9_source.five9_bulk_call_data_tabularv2 AS five9
RIGHT JOIN Financial_Log_Dataset.Billable_Leads_With_Rates AS leadrate ON
  SAFE_CAST(leadrate.leadid AS int64) = SAFE_CAST(five9.lead_id AS int64)
WHERE DATE(earliest_esign) BETWEEN '2025-03-01' AND '2025-03-31'
ORDER BY lead_id DESC, date_and_hour DESC
Coming back from lunch. I shall run this in BigQuery to see what you are seeing
Oh okay sounds good. I shall talk to Aidan to see if he can pull the reports for this stuff for the other phone systems
Waiting for it to run now, but I am testing this:
SELECT
  five9.date_and_hour,
  five9.bill_time__rounded_,
  five9.lead_id,
  leadrate.io_call_time_minute_rate,
  leaddata.client_firm,
  leaddata.old_name
FROM five9_source.five9_bulk_call_data_tabularv2 AS five9
RIGHT JOIN Financial_Log_Dataset.Billable_Leads_With_Rates AS leadrate
  ON SAFE_CAST(leadrate.leadid AS int64) = SAFE_CAST(five9.lead_id AS int64)
RIGHT JOIN Financial_Log_Dataset.Billable_Leads_Unique AS leaddata
  ON SAFE_CAST(leadrate.leadid AS int64) = leaddata.leadid
WHERE DATE(leaddata.earliest_esign) BETWEEN '2025-03-01' AND '2025-03-31'
ORDER BY five9.lead_id DESC, five9.date_and_hour DESC
That way we can split by Client Firm or by campaign for total talk time
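For the split itself, something roughly like this follow-up aggregate could work (table and column names mirror the query above, so treat them as assumptions):
```
from google.cloud import bigquery

client = bigquery.Client()
# Total talk time per client firm; the join mirrors the query above and is an assumption
sql = """
    SELECT leaddata.client_firm,
           SUM(five9.bill_time__rounded_) AS total_talk_time
    FROM five9_source.five9_bulk_call_data_tabularv2 AS five9
    JOIN Financial_Log_Dataset.Billable_Leads_Unique AS leaddata
      ON SAFE_CAST(five9.lead_id AS INT64) = leaddata.leadid
    GROUP BY leaddata.client_firm
    ORDER BY total_talk_time DESC
"""
for row in client.query(sql).result():
    print(row.client_firm, row.total_talk_time)
```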
Then for the query itself I can see you are doing the CTE method which makes sense
Sorry about that. We can talk privately in the 1:1 but Ahsan really had no place there, especially when he doesn't understand any of what it actually goes to
No worries I understand. Yeah we can talk more about what you sent me at 1pm that works with me 🙌:skintone2:
I got together with Dustin and told him I am ready to pass off my code to them. He said that is okay and I asked him if his team wants to write the script for Luis's email where it will email the firm the discrepancies and he told me he is fine with me doing it since his team doesn't specialize in JavaScript (specifically app scripts). So I shall work on that and afterwards he is going to take the coding stuff I worked on and merge it how he sees fit for SL / integration's team for Nick M
Heck yeah man, great job!
Oh snaps ! That sounds exciting ! :meowattention: I can see the email invite for it thanks ! :catroombaexceptionallyfast:
Hey James. I wanted to know if we can postpone the 1:15 meeting / coding session til I can figure out and finish Ryan's VSS monthly reconcile report for billing for Rose. Ryan stated he wants it around tomorrow morning and I am currently cleaning up the dashboard of Ryan's since the statuses are kind of a mess (Due to agents putting it in a billable status and then changing it to a non-billable status since it wasn't supposed to be in a billable status but Ryan's dashboard already captured it and I have to manually adjust those to remove them along with making sure I have the correct count of finals for each case type, etc).
Not only that, it isn't very clear to me which statuses are "finals / billable" for VSS / secondary for them since I see completed and sent statuses
Please go ahead and throw that in the project chat so other ppl in the meeting + Nick Ward can see it, also please give any kind of ETA on how long that will take.
Hey James. Happy Tuesdays. When you get the chance, I want to talk to you about talk time for five9. That is the only thing that needs to be finished from my end for Ryan so that we can then let Rose the accountant know and she can bill the firm(s)
I want to ensure we are calculating it correctly and I may need to pull Aidan on the side too since you did share your dashboard with me but I want to be on the safe side still by discussing it since I've never had to calculate the talk time minutes b4
Also if possible if we can get this done any point today that would be greatly appreciated since Ryan wants to get together with me and Zek about it tomorrow
Sure thing! I'll swing by later and we can review~
Hey do you want me to try and get you some tech merch while I'm here?
Hey James 🙌 I am not opposed to receiving any gifts 🙏 I hope you guys are having a great time 🙌
I'll see what I can grab you. Stickers at least!
I got some GitHub stickers lol
Sweet ! :catroombaexceptionallyfast: I am looking forward to seeing them ! I may add them to my work laptop or my personal one at home 🙌:skintone_2:
I got you! (Also available in white, I got one for you, one for ward)
Hey James. I wanted to confirm that we are still good for our 1:00 pm?
Hey there, Ward is out of the office right now, but I have something for you to work on when you get the time..
Can you please research into docker and google cloud functions? I am writing up the code that combines monthly expenses and monthly profit and since it's much more intensive I'm doing it in python
I want to make sure we are good to go to have it run on the cloud once it's ready
Hey James ! That makes sense. I have created a few google cloud functions that were successful in automating processes. I am more familiar with gen 1 but I know gen 2 is the newest one, so I can do research on that. Do you have anything particular that you want to know about, or are you more interested in seeing example code?
Check out the expensesandprofit.py file in the Newfin github and become familiar with it. If you still aren't added lemme know
That makes sense. I am looking at it as we speak and so far it is making sense to me
from google.cloud import bigquery, secretmanager is my recommendation if you're planning on using secret manager
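Something like this is what I mean, just as a rough example (project and secret names are placeholders); pointing at "versions/latest" always grabs the newest key version, so rotating the JSON key later shouldn't break anything:
```
from google.cloud import bigquery, secretmanager

# Pull the service-account JSON from Secret Manager at runtime instead of shipping the key file
sm_client = secretmanager.SecretManagerServiceClient()
secret_name = "projects/my-project/secrets/bq-service-account-json/versions/latest"  # placeholder
payload = sm_client.access_secret_version(request={"name": secret_name}).payload.data.decode("utf-8")
```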
so the issue I'm running into right now is on my billable_leads_with_rates dataframe: my 'earliest_esign' column is a timestamp, while month_year_end_date_pk is a date datatype, and they aren't playing nice together
*Thread Reply:* Yup I had a similar issue too where the data types have to match what is shown in the BigQuery schema or else it won't work
If you need something to trigger it I can help you with creating a trigger such as a pub/sub
I need to build some kind of method to turn them both into the same datatype
We also need to go through every column on here and try and figure out what we should actually be adding up for monthly profit.
*Thread Reply:* Monthly profit should be based on what Ryan has on his dashboard for TIP Financials dashboard v1.3 for "Profit X Month".
*Thread Reply:* Alright cool, so if you want to do some coding stuff, we're gonna need to start converting our data to those points
*Thread Reply:* 1. Billable Leads (count)
*Thread Reply:* Take a good read through that NewFin documentation I sent you, and make sure you understand how the code is working so far on expensesandprofit
*Thread Reply:* I just updated the code to include the date standard btw
*Thread Reply:* Go ahead and make a branch off main, and build some code that determines a count of billable leads per month off of the billable_leads_with_rates where the esign date matches the date of monthly_expenses_processed
*Thread Reply:* If you can make it in a separate python file so we can import it as a function to keep it nice and clean, that would be awesome, but it isn't 100% necessary
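*Thread Reply:* Roughly the shape I have in mind, just as a sketch (it assumes the cleaned dataframe from earlier in the script, and the column names follow our chat rather than the confirmed schema):
```
import pandas as pd

# Count billable leads per e-sign month from the already-cleaned dataframe
billable_per_month = (
    billable_leads_with_rates
    .assign(month_and_year=pd.to_datetime(billable_leads_with_rates["earliest_esign"]).dt.to_period("M"))
    .groupby("month_and_year")
    .size()
    .reset_index(name="billable_lead_count")
)
```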
*Thread Reply:* That works with me :catroombaexceptionally_fast:
*Thread Reply:* I reviewed the table schema for the two that you are referring to (billable_leads_with_rates and io_tip_monthly_expenses) along with the NewFin documentation as well. I also took some valuable notes on how to calculate the director's commission, which I shall share with you along with the spreadsheet that is used for it:
It looks like pandas has a pd.to_datetime() function that we might be able to use to simplify this whole thing
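Something like this, maybe, where both columns get coerced with pd.to_datetime() and cropped to the same 'YYYY-MM' string so they compare cleanly (column names are assumptions from our chat):
```
import pandas as pd

# Normalize both sides to a 'YYYY-MM' string so the timestamp and the date compare cleanly
billable_leads_with_rates["esign_month"] = (
    pd.to_datetime(billable_leads_with_rates["earliest_esign"]).dt.strftime("%Y-%m")
)
monthly_expenses_processed["month_and_year"] = (
    pd.to_datetime(monthly_expenses_processed["month_and_year"]).dt.strftime("%Y-%m")
)
```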
If you can start digging into how we can have a cloud function read from github, and how we can trigger it, that would be super helpful
*Thread Reply:* Copy that. Would you be open to using a pub/sub to trigger it? That is built in GCP and I have some code already built for triggering it by a pub/sub.
*Thread Reply:* Potentially, or we can schedule it since it's going to be mainly in python. What would be the trigger for it?
*Thread Reply:* Pub/sub is basically a message that will trigger the function. If we were to trigger it through github we would have to use github actions, which I was told has a limit on how many we get a month, but with pub/sub we can have as many as we want.
*Thread Reply:* Essentially we can define what that message is and when it finds it it will run. But since it is just one function itself and no other functions, the topic (pub/sub) will automatically see that a message is inside the topic and will trigger it
*Thread Reply:* What triggers the pub/sub will be cloud scheduler.
Cloud scheduler --> Topic (pub/sub) --> Pub/sub subscription sees message --> cloud function runs.
*Thread Reply:* We tell what time cloud scheduler runs and then the rest flows.
*Thread Reply:* So essentially when you have the code built I can then help you create the pub/sub topic and subscription and how to set it up on cloud scheduler and the rest will flow easily (for the most part)
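*Thread Reply:* The entry point side of it looks roughly like this (just a sketch; the function name and message contents are placeholders):
```
import base64
import functions_framework

# Gen 2 Cloud Function entry point triggered by a Pub/Sub message
# (Cloud Scheduler publishes to the topic on a cron, the subscription delivers it here).
@functions_framework.cloud_event
def run_newfin_pipeline(cloud_event):
    message = base64.b64decode(cloud_event.data["message"]["data"]).decode("utf-8")
    print(f"Triggered by scheduler message: {message}")
    # ... run the expenses-and-profit code here ...
```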
*Thread Reply:* Cool that might work well! Thank you
Sorry, getting lost in threads- you're right to approach it that way though since we're looking at a few different issues
Ultimately we are using python to do some intensive calculations in order to create and update a bigquery table which will reflect the monthly profit?
It can be done in sql but it's gonna be slow and expensive
Python's going to be much better at doing math quickly and efficiently.
Hey there~ I updated the code on expensesandprofit.py last night, to be a lot easier to read
Hey James ! Thank you I will run a fetch to see those updates
Also I am excited to work on a project other than Flatirons ^^
I'm excited to code with you lol
But apparently Joe miscalculated or misread or something of that nature, and it is going to be the same spot
I thought they were swapping the first 2 rows??
Same but they are just moving the medical recorders peeps instead
So I get to keep my spot next by the table that is used for b-day parties and food
Lol I'm happy for you! That's a good spot
If I could make this into a sticker or a shirt or a mascot I would be fulfilled in life
Hey good morning- I dug into the Building "Monthly Expenses Processed" last night and ultimately couldn't find a solution I loved before heading out so I'm going to dig into it a bit this morning along with some other pieces for the Newfin code
That makes sense. Some pieces I am still trying to piece together lol
As of right now I am currently working on the newfin stuff such as mapping out how the field names are similar to each other so that when we have to "rebuild" some of the tables we can do one for one matching as per our previous discussion
*Thread Reply:* So far each tab will be each table name and inside the tab will be the schema and example values
*Thread Reply:* Then I will have a tab that will do the matching part and determine what in common and what isn't
I wrote like 3 or 4 functions last night trying to find something that works but they all felt off or had some unintended consequence.
*Thread Reply:* That makes sense. Do you think you need more clarity from Ryan on how that column is determined?
The flatirons stuff wasn't hard since the majority of it is automated so far
Came up with something new while half asleep last night and ran it through AI to check it this morning..
```
# ==================================== Building "Monthly Expenses Processed" ========================================
monthly_expenses_processed = (
    io_tip_monthly_expenses
    .assign(sum_of_expenses=lambda df: df.sum(axis=1, numeric_only=True))
    [['month_year_end_date_pk', 'sum_of_expenses']]
    .rename(columns={'month_year_end_date_pk': 'month_and_year'})
)
print(monthly_expenses_processed)
```
AI response: The two snippets do the "same" end result (one row per month, one total per row), but under the hood they're quite different:
.assign() version:
◦ Pandas does the sum once, in a single vectorized operation, no Python-level loop at all.
dropna() vs. sum(..., numeric_only=True):
• Your original used io_tip_monthly_expenses.dropna().sum(axis=1, …).
◦ By default, dropna() with no arguments drops any row that has any NaN in any column. You'll lose rows you might've wanted to keep.
• With df.sum(axis=1, numeric_only=True)
◦ pandas will skip over (not drop) non-numeric columns and NaNs within each row, so you still get a total even if one of your expense columns was null that month.
Overall, .assign(... sum(...)) is:
• More efficient (no repeated work)
• Safer (doesn't nuke rows with any NaN)
• More idiomatic (1 line vs. 5+ lines of setup + loop)
I recall reading groupby works well for month to month cases
I tried groupby last night. I don't remember why but for some reason the AI had some issue with it?
Hmm, if you try the piece of code I have in github where it uses groupby and check it out with AI it should say it goes through each column and row
But ultimately since there are two ways of doing it (based on what I am seeing so far) either way should do the trick
To merge them into main I think we can just adjust the for loop to be what you put or what I put (groupby) then we can copy over the billable leads count lines and I think that should do the trick?
Sorry about that! Im planting seeds to try and get you and I desks that are close by with a tv nearby so we can work together
No worries ! I am down for that. They stated yesterday they weren't going to move my desk and Brittany's team but this morning they did and I was told they may move it again lol
I asked Tony about it and he said it was IT and the only person I know who probably has the power to make those decisions is Joe
thanks for building that out and figuring out secrets manager
You're welcome, James. I created a few cloud functions so I can share some of my knowledge on that
Secret manager, cloud functions, cloud scheduler, gen 1, gen 2, etc - that all has its own documentation, and luckily I was able to figure out a system to make them all work, so I can share that with you for this project
Okay, I have created the pull request for the secret manager. Let me know if I did it correctly, and also if you can check the code that would be greatly appreciated
Cool, I emailed ryan, and I'm making some tweaks to call stats, lemme wrap up these few lines rq and I'll check it out
I took main, made a new branch called SecretManagerBranch, then adjusted the code and pushed it to GitHub so we now have that new branch, and then opened a pull request
Sounds good. Once that is approved then I will create a new branch from main and then transfer over the billable_leads code into it
Looks good visually- pulling it down and testing now
Cool looks like it worked
Merged into Master, tested on master and it looks good
Sweet ! I shall now work on adding the other part of the code underneath what we got above
I sent in a new pull request into main. Feel free to review it when you get the chance.
Also once it is merged into master branch feel free to delete these two branches since master will now have everything:
BillableLeadsCount
Edward_Refactor
Looking now
The dataframe we want to reference is billable_leads_with_rates. If we just want certain columns we can do that by indexing...
Just throwing notes out there as I review
So earlier in the code, I pull that table and clean it (turn it to a dataframe, standardize datatypes etc)
So it looks like your logic is sound, but what we wanna do is create a new dataframe based off the current dataset we have. Looking for a way to display what I mean now
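Roughly this shape, purely as an illustration (the column names are placeholders):
```
# Build the new dataframe off the frame we already pulled and cleaned earlier in the script,
# instead of re-running the BigQuery query and creating a second copy of the same data.
billable_subset = billable_leads_with_rates[["lead_id", "earliest_esign"]].copy()
```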
Redoing the query again and recreating another DF when it was created above
making a tweak and I'll commit it so we can compare the differences
I really like the way you built the logic for this!
Thank you :catroombaexceptionally_fast: & I need to get better at a lot of functions myself haha
thanks for throwing the merge in there too
Here I just committed, check out how I tweaked it. I kept the majority of your logic, but re-used a lot of the code we already built in
Your same branch, that way you can view em side by side
Honestly, it was a pretty simple tweak to get it working with the rest of the code
In retrospect I should change the variable name month_and_year to esign_month_and_year to be more clear
I left a while ago but wanted to provide you an update in person but didn't see you at your office
I pushed a new commit for your review and working on finding the fields that matches each other in that dataset
lol sorry about that, I stepped out for lunch
I'm using that .groupby(), .sum() method you came up with to add the by row revenue per lead
I'm gonna push a commit soon to a new branch if you wanna check my work? I'm a little tired this morning so I need it lol
Sounds good ! I can check. Also did you approve / commit my new branch into main? I added / adjusted just one line
It is for the JSON secret. I updated it where it will look for the latest JSON key always
I don't believe so but let me take a gander
Thanks ! & It works the same. Just in case others download a new JSON file in the future, it won't affect older pipelines down the road
Let me see if it causes any issues with my merge, if so no worries I'll manually throw it in
I will definitely be keeping that for the future though haha
I mean sometimes that may be more effective for certain cases such as flatirons projects 🤣
Sorry, Zek took my attention from 10:40 am to 11:00 am since the dashboard wasn't matching, and I jumped into the meeting from 11:00 am to now, so I shall be taking a look at the code in github
When the SL dashboard doesn't match Zek usually comes to me or Esteban to see why that may be the case since SL dashboard doesn't have LR leads #s to it
Hey there, when you say dashboard, which dashboard do you mean? tip v1.3?
Hey James. I was referring to V1.2 however I just checked v1.3 and it is the same
Both dashboards should show the same input, since one is for TIP to see and the other is for SL to see
Also I reviewed the branch you were referring to, which is Lead-total_values, and I think it is good to push to main.
One adjustment I think we should make would be renaming the function called standardize_dates_to_datetime. It's actually turning the date into a 'YYYY-MM' string, not keeping it as a datetime, so the name's a bit misleading. Maybe something like convert_date_to_month_string would be more clear?
dataframe[column_name] = pd.to_datetime(dataframe[column_name])
dataframe[column_name] = dataframe[column_name].dt.strftime('%Y-%m')
The top line makes it into a datetime data type and the bottom line makes it into a string for a date
Good catch, I'm looking into it now. In the first section we do change it to a datetime, but where I initially thought we were just 'cropping' that value, it looks like we are turning it into a string
Yeah and depending on how we have it in BigQuery (the datatype) we may want to keep it as a string or datetime data type
Heyo~ I think I found a way to make those changes to the table names like you mentioned
We can't change the names at the sources because Bigquery limitations BUT..
Hey James ! Yeah there is a sql query you can run to change them
I was gonna say we can change the names in the scheduled queries and re-generate the middleware tables
We just have to be really careful and methodical about how we do this because if we change it at the source first, it can break scheduled queries.
Yes I agree, we should modify the pipeline (coding and queries, etc) prior to changing it or at the same time
Have we tested? I'm seeing conflicting info on the internet
The screenshots I shared with you is what I just did right now
I had asked ChatGPT a long time ago and it said no, we can't. Then I clicked on "Search internet" and then it said yes.
Esteban is going to school for his data stuff and he stated to me we should be able to change it through ALTER TABLE, and he was right
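For reference, the rename looks roughly like this if we run it through the BigQuery client (dataset and table names are placeholders; the new name stays unqualified):
```
from google.cloud import bigquery

client = bigquery.Client()
# BigQuery DDL rename; scheduled queries pointing at the old name still need updating separately
client.query(
    "ALTER TABLE `Financial_Log_Dataset.old_table_name` RENAME TO new_table_name"
).result()
```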
But I think they should update it where we can change it in GUI if we have admin access lol
Now we just gotta audit what tables we actually use, and work backwards lol
Agreed 🙌:skintone2: Also, how can I help you with Ryan's project stuff
I made some tweaks to it I need to update you on, it looks like we are set for this meeting with Ryan in a few and I'm sure we will pick up more tasks there lol
Yeah I am open to give you my feedback and also see how I can help make it easier for you as well 🙌:skintone2:
I may not be on that meeting though since I checked my calendar and don't have an invite lol
Yeah it looks like it's Mcfadden, Brian and I but I do have a task for you- we will go over it later
Sounds good James. Just came back from lunch so just seeing this now
Taking lunch in a few mins but I'll sync with you when I get back
Let's push back our 1:1 if possible. I ended up getting stuck on the way to lunch so I got out at like 12:45
No worries, my PC was MIA since Joe was installing stuff into my laptop and a driver into my desktop so just seeing this now
I am almost done installing stuff with Joe on my laptop. Sorry for the delay
Thank you! I invited you to the meeting with Darrel in case you want to learn how they pull custom reports in Litify. I learned today that he can create them, and if he does, I can rely on his report instead of the firm's, since JP is known to change those reports on a whim
Cool, it may be beneficial so I'll try and join, but depending on how Friday goes I may have to opt out.
For the project for you, remember how we talked about using the bones of compare_csv.py with the Google Cloud API to pull data from LR to get a clear, automated report of the discrepancies between the two lists?
I want to work with you on a piece that uses that to build a daily report, so you can see discrepancies daily, adjust what needs to be adjusted (likely on the oldfin side), and report the results daily to Ryan, Mcfadden, etc.
Ohhh so build a cloud function that will compare it daily and gives us the output ?
Exactly yeah- and that way you can go in and edit the leads on his end (and maybe ours) as needed to make sure they match up
I see what you mean. I shall see how that can be done, since I think what we have already set up may be good enough (which is the code piece you made + pulling those reports from Ryan's dashboard).
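Just thinking out loud, the daily discrepancy check could probably be expressed as a single query along these lines (every table and column name below is a placeholder, not the real one):
```
-- Sketch: surface leads that exist in only one system, or whose billable dates disagree.
WITH old_system AS (
  SELECT lead_id, billable_date FROM `project.dataset.oldfin_leads`
),
new_system AS (
  SELECT lead_id, billable_date FROM `project.dataset.newfin_leads`
)
SELECT
  COALESCE(o.lead_id, n.lead_id) AS lead_id,
  o.billable_date AS old_billable_date,
  n.billable_date AS new_billable_date,
  CASE
    WHEN o.lead_id IS NULL THEN 'missing_from_old'
    WHEN n.lead_id IS NULL THEN 'missing_from_new'
    ELSE 'date_mismatch'
  END AS discrepancy_type
FROM old_system AS o
FULL OUTER JOIN new_system AS n
  ON o.lead_id = n.lead_id
WHERE o.lead_id IS NULL
   OR n.lead_id IS NULL
   OR o.billable_date != n.billable_date;
```
That output could then be dropped straight into the daily report for Ryan.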
Good morning James. I had received your email about a dashboard request
Cool- This is a flatirons/dicello thing that mal is pulling us in on
Btw, I just wanted to say- You have really been kicking butt lately in not just keeping up with me and shifting projects, but actively working alongside me on the chaos
You're picking it all up quick and should feel proud
Thank you! If I had long hair, this is what I would be doing haha 🙌:skintone2:
After a while I get used to the "chaos", but I am liking what we are building and it is making more sense, so it is exciting
Hi @James Turner. I am comparing both sheets (Ryan's dashboard) and the query you posted for us to use to compare the difference
I believe my next question is a yes, but I shall ask to be on the safe side
If the case type is NOT in the REF table then that can be a reason why it isn't in the query results?
Yes! That is a reason it would not show up
Oh okay, I recalled you saying that but had to be on the safe side
I shall go ahead and add those case type(s), since I think the "challenge" of keeping this correct is always having that REF table up to date
If a case type has a Yes for 'intakebool' and a Test for 'secondarycategory', does that mean it should not show up in the BigQuery results?
It means that it was probably set up incorrectly on REF and that column needs to be checked lol
Just added lr casetype id 2016 if you wanna fill out the info on it
Good morning @James Turner Sounds good I'd be happy to add it to the REF sheet
Update: I was able to add that into the REF sheet. Also, I did a pull request to main for NewFin with a tool that I think will be helpful, based on what you were asking me to do above when it comes to comparing Ryan's dashboard with the new BigQuery financial system
If you have time I would like to share that with you, and the reasoning for my approach as well
In short, it is very similar to the Python one you shared with Esteban and me; however, I enhanced it and I have an idea of how to get it fully automated as well
That's awesome! I'm excited to see it. I'm poking around right now to determine why those 10 leads made it into LR data but not into our system
It looks like they got pulled into Billableleadsunique which is the first step in the pipeline
Thank you :catroombaexceptionally_fast: & yeah, I am a bit stumped. I looked at them very closely, checking the REF table to see if they are intakes, if they have the case type name on there, and if it is a final status, and I checked the dates (just in case it was a time zone difference), etc. So it might be some type of logic that is excluding them, and/or it might be Dustin's code, due to Law Ruler having API limitations? I am not 100% sure
It has to be something in the scheduled query for billableleadswithratesand_lp
Figured it out. They were all getting flagged as test leads because their details contained the word "test"
I had the test filtering for most columns on because I was worried integrations would put it in weird places.
I thought reporting was the only one who had that issue but that does make sense
lol not quite, it's a lot simpler than that
Just a big ol scheduled query joining tables.
Does this link take you to the scheduled query?
I'll cut them down to just include what's in that screenshot I sent
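Roughly like this, only checking the columns where test data actually lands, and matching the whole word so real names don't get caught (the table and column names here are placeholders):
```
-- Sketch: narrower test-lead filter using word-boundary matches instead of scanning every column.
SELECT *
FROM `project.dataset.billable_leads`
WHERE NOT REGEXP_CONTAINS(LOWER(first_name), r'\btest\b')
  AND NOT REGEXP_CONTAINS(LOWER(last_name),  r'\btest\b')
  AND NOT REGEXP_CONTAINS(LOWER(email),      r'\btest\b');
```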
But it looks like it is an SL org, since I get: "Access Denied: Table shield-legal-bi:leadspedia.allleads: User does not have permission to query table shield-legal-bi:leadspedia.allleads, or perhaps it does not exist." but otherwise I can still see the query
I shall be going into a short meeting and I think I will be taking lunch afterwards
Let me know if you are able to attend the 1:00 pm meeting with Darrel and me. Pretty much going to see how he creates custom reports in Litify and how we can use it for Dicello and/or Flatirons stuff
JP, their coder there, is known to change up scheduled reports all the time without telling anyone, which has broken lots of our code (some silently and some not)
Darrel stated he can create custom scheduled reports that won't change, which essentially will help keep our code from breaking. That is the purpose of it (and why I am interested)
I think we should be trying to move away from the scheduled reports in Lawruler and build out those scheduled reports in GCP, per what Ward said, but we can take a look!
Sorry, I'm still booting up this morning haha
No worries haha. Also, the issue with just using the GCP stuff (Dustin's code) is that it isn't always up to date (based on my experience and reporting's experience), so since we don't want to look bad in front of the client, some scheduled reports mixed with a little bit of Python isn't a bad thing
Like we had times where, on specific days, it didn't capture all the updates for specific leads, etc.
To confirm, the link you sent me is the "new query" to check against Ryan's dashboard and NewFin?
uh no? Here's the scheduled query name I was referring to:
BillableLeadsWithRatesandleadspediadata SCHEDULED
If the data in the pipeline isn't doing what you think it should be, check that query, as it goes through each data source that is necessary for the end table
If that is the case, I will let Ryan know that I may need you, or someone who has access to the SL project, to run this query and send me the results. The reason I say that is Ryan posted today that he would like me to send this to him daily, but it gives me the "Access Denied: Table shield-legal-bi:leadspedia.allleads: User does not have permission to query table shield-legal-bi:leadspedia.allleads, or perhaps it does not exist." error since I am not in the SL project, and I don't think I'd get access to that lol
So basically, I may need you to run that query for me in the morning, and later on I can provide him the update if needed (possible solution)
Well, he said after I do S1s and declines in the morning he wants it right away, but I will have to tell him it will have to be later in the day lol
So you don't need to run it- the result of the query is run every hour and spit out as tort-intake-professionals.FinancialLogDataset.BillableLeadsWithRatesand_lp
Ahhh okay. I guess I was confused with what query I run to get the results to compare it to Ryan's old dashboard system
Miscommunication on my part, sorry. What query do I run for that?
Like the one for tort-intake-professionals.FinancialLogDataset.BillableLeadsWithRatesand_lp. The same one I ran in the morning?
so the query I gave you that pulls the shield legal ones specifically with the date filter
runs on the tort-intake-professionals.FinancialLogDataset.BillableLeadsWithRatesand_lp table
Gotcha. I found what I was referring to. I was asking if I still run this to get the results to compare with the dashboard:
Just want to make sure this didn't change since I know you updated some SQL things
Ahh gotcha, Yes that is still what you run lol
I just needed to edit the source table for that so it included the proper tests
Oh okay perfect. I was afraid that might have changed and I'd now have to run that other query that only SL individuals can run lol
I am impressed and can't wait til we polish / enhance the last parts of it
Nah- Honestly I want to change the name of tort-intake-professionals.FinancialLogDataset.BillableLeadsWithRatesand_lp because it was never meant to become the de facto table everything is pulled from
but changing the name now would affect Ryan's pipelines too much
Yeah, I see what you mean. We can check with him which pipelines are using it; it should be a simple change (I am hoping), since having consistent field names, table names, etc. will be better long term
Hey there, won't be able to make it to that meeting, but please share any notes you get if it seems relevant. I got pulled into other things
Meeting Notes:
• Litify Field Names:
• Litify uses two types of "field names":
a. A display field name — this can be changed and is visible to all users.
b. A UI (User Interface) field name — this is a unique ID tied to the field that does not change, similar to how Law Ruler assigns an ID to each case type.
• Best Practice Reminder:
• Darrel mentioned it is best practice to only update the display name that users see, without creating a new field (which would generate a new UI ID).
• Historically, D.L. hasn't always followed this practice, so Darrel advised us to be mindful when making field name changes going forward.
• Custom Reports from Darrel:
• Darrel has the ability to create and schedule custom reports in Litify. However, only individuals with Litify access (currently Darrel and MRT) can directly receive these reports.
• To work around this, Darrel schedules the report to send to himself and then uses Google Forward to automatically forward the email to specific recipients.
• How JP Processes Reports:
• Darrel explained that JP likely receives the scheduled reports and uses Python to combine the separate .csv files into a single .xlsx workbook, creating:
◦ Separate tabs for each individual report
◦ An "All" tab that consolidates everything
• This process results in many duplicate records, which explains the duplicates we currently have to filter out in our code.
• Next Steps (per Brittany):
• Brittany emphasized that we need to request Darrel to only send Flatirons case types moving forward, as that is what Abe agreed upon.
• We are scheduling a follow-up meeting with Darrel on Monday to finalize the best method for ensuring that we only receive the necessary data, without any extra/unneeded information.
Also, the other day you sent me a Google Sheets doc for Doug's project. I went ahead and created a flow chart to help assist as well and shared that with you. I am not sure where everyone is in this process, but I do want to help in any areas that I can
Thank you so much!
In reference to the Google Sheets doc for Doug's project, I got pulled in on it, but it's primarily a Brittany and Mcfadden task, so they know more than I do
You're welcome. If that is the case then I shall put a pause on it and discuss it with them, since I don't wanna overstep my boundaries with it.
Also, I sent you an optional invite for the next meeting with Darrel. This one is just about receiving specific case type fields only.
I'm working on figuring out why 717873 isn't showing up and it's weird..
717287 showed up after a little bit which makes me think it got caught oddly in scheduled queries
but 873 is still not, despite showing up in other things
Yeah I thought the same thing tbh. Since everything else you showed me I checked
Good afternoon James. I wanted to see if we are still good for today for our 1:00 pm
Hey there, I moved it to 1:30 but otherwise we're good!
Available whenever you are. I am in the google meet 🙌:skintone2:
I love that ryan started using :fastparrot:
Yeah, I feel like everyone may be serious all the time, so these emojis (in my opinion) make me more relaxed (in a way) lol
I think you're right. it's difficult but staying calm through the chaos makes it so much more doable
Hey there~ Let's scratch the 1:1 today but if you can, please send me an update with any progress on getting that postgresql and bigquery data into sheets
Hey James! Thanks for the info. I was about to ask about the 1:1. As for the progress, per our previous conversation, let me know if this is still true.
I had talked to Dustin about how to get that table (Ryan's table that he uses for the dashboard) into BigQuery, and Dustin stated it requires setting up a different connection (which requires Ryan's input) and the data goes through AWS via lambdas, etc. Long story short, he (Dustin) stated it isn't a simple process; therefore, I thought we both agreed that a simple copy and paste into the spreadsheets would be the most efficient use of our time. Is this still true?
Sorry, I think I misunderstood before; as long as you're okay with doing the check each day, that's totally fine
The reason I nudged toward code is so you can become familiar with more API connections, and linking data from multiple API connections, but ultimately as long as it gets done, it's all good
By the way, I'm finally getting the PCTID into retool and wrapping my head around it
I appreciate you checking up on me on this 🙌:skintone2: Yeah that makes sense and I appreciate you looking out for me to obtain more experience :laptopparrot: I am up to look at that retool stuff if you want some feedback or just need someone to test it out, etc :catroombaexceptionallyfast:
Yeah, from what I am seeing the dashboards (old fin and new fin) are pretty much matching, with the exception of that "weird bug" that fixes itself later on
I may have access to it, but I don't recall the link for it, and possibly not the credentials that go with it.
It doesn't fully work yet; I'm sorting out the bugs while creating it, but I am moving things over to a tag / array system, so when you go to enter a law firm, it will display entries that are already there and you can just select them.
That makes sense! I don't think I created credentials for it. Did you sign in through your email? And I can take a look at it once you have more things polished, if you think it's not ready to be viewed yet
I did sign in through my shield email. I wouldn't show it to anyone else yet, but since you had such a big part of pctid_v1
i want you to see how v2 is developing lol
Oh okay if that is the case let me see if I can sign up / sign in through my email 😬
Might be because I just signed up through my email (work email) or maybe from your end
Cool it's under personalized Casetype id
Hey there, do you have any idea what table Ryan is talking about?
He shares so many queries I don't recall using that one. But after looking into the SL project I can see this: shield-legal-bi.sldashboard.orderfinancials. I would go with the field lpcontractids (my best guess) since you are looking for Leadspedia IDs
Yeah I'm looking for leadspedia contract ID's and the Lawruler campaign names so I can tie em together
I'm sadly not seeing anything with the Lawruler data though
I think the one I sent you has all of them (live). For a full list, I think this table may be better: orders_list
Closest I'm seeing is that orders_list so far, but the names on campaigns changed so most don't work anymore
in orderfinancials table (sldashboard dataset) it says: cprlpcontract_name
Each campaign has its own contract name (I believe a few may share 1-2), so that would be the closest thing to a primary key for passing over "lpcontractids"
I could have sworn each campaign only has 1 contract name though, so that may be the unique identifier you can use
I think what we are looking for is "name"
but they never updated it to include the right lawruler names after they change them
I feel like there might be more fields we can pull from Leadspedia that can match but may not be on that SL table, such as the case description #, which can be used as a one-for-one match with our TIP database
My guess for this is we're gonna need to match on one thing, then match on another, then pull data from both tables into one
So match with one to find another unique identifier, match with that, and then we can pull the data for the Leadspedia ID (my guess)
Or we may have to have someone manually update something (such as a spreadsheet) that matches case type (Law Ruler name) with Leadspedia ID # and pull from there (but sadly it is a manual thing) (worst case scenario)
I think you're right. Probably going to have to pull all of the leadspedia ID's from my billable leads tables, pull all the contracts from the leads, and write the leads to the contracts that way
we might miss a few contracts if they never got a billable lead but ehhh
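Roughly what I'm picturing, with placeholder names (my real billable leads table already carries the contract ID):
```
-- Sketch: derive a campaign -> Leadspedia contract ID mapping from billable leads,
-- since those rows already have both identifiers on them.
SELECT
  b.lawruler_casetype_id,
  ANY_VALUE(b.lp_contract_id) AS lp_contract_id
FROM `project.dataset.billable_leads_with_rates` AS b
WHERE b.lp_contract_id IS NOT NULL
GROUP BY b.lawruler_casetype_id;
```
Campaigns that never produced a billable lead just won't get a row, which is the gap I mentioned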
Yeah or we have to create our own "primary key" by using a google sheets which I would like to avoid lol
Or may need to see what fields leadspedia has and see if we can pull more
I gave you access. Go to the Primary Casetype ID tab
Once we get the Leadspedia contract ID's in we can move everything over to this table instead of the excel one
Try editing it- It uses entries that were already entered to give suggestions
My Billableleadswithratesandlp already has the contractids
I'm sure there's still errors floating around, see what you can find quickly (I also used some machine learning stuff to normalize the data a little further)
If people want to add tags like tv or flyer or timbukto or whatever
they can add all the tags they want as an array system
Interesting. So it looks like not every campaign actually has billable leads with a contract associated
a lot do, but not all of them, which is odd.
Hell yeah it worked! A new campaign got created in Lawruler and it automagically pulled it in!
Sorry, I shall check / test it out shortly. I'm in a meeting to talk about outlining the entire Flatirons process to ensure all departments are doing what they are supposed to be doing, including having an outlined process to send back to Abe from Flatirons
Do what you gotta do, I'm just excited lol
I think may be easier if I were to pop by so we can test those together since I am not familiar with how to use retool tbh haha
Let me know when you are available James I can pop by :catroombaexceptionally_fast:
I am OOO for the day, but it works out because I want you to watch something and let me know if it makes sense..
Go to Primary Casetype ID tab, and scroll down to the bottom
Notepad me: Please like, subscribe and share with all your friends and family
I wonder if we can include "links" to it, since if that's the case we can use Loom, which can also work as a screen recorder if they want an "audio" version of it. But essentially it looks nice
It's going to be a small team of people that use this, but I needed to make a tutorial for mal anyway so I figured I'd throw it on here
More importantly though, do you feel like you could use this tool after watching it? At least to update case types as they come in
Let me watch the video first to give you my Qs (if there are any) if that is the case
I am assuming you have the data connected to GCP as a table, right?
I processed data out of the Excel one, cleaned it up quite a bit, and then imported it column by column
Gotcha, and when a new case type comes in, the fields will be blank except the "lawruler case type name"?
What if there is an option in the dropdown that doesn't match with anything (such as a new firm or a new tort type)? Would we need to update the REF table?
Having dropdown makes sense so that we don't select something that has "extra spaces" or extra "-" etc
Nah you just type it in, and select what you just added from the dropdown
Ah then in the future that new field value has been "created"
I wasn't sure if that needed to be added or not, so I appreciate you asking
What you showed makes sense to me (above) and in the retool website. In my opinion, I would probably just show how to select things and what to select in order to give us the info that we need and maybe which fields are optional.
Then have a separate video that goes over "common questions and good to know"; that way one video is straight to the point showing how to use it (which, once we know how, we can skip through / fast forward), and the other part covers "what is needed" for the pipeline
Not sure if you can include "more videos" or "links" to more videos but my thoughts on that
I know you recorded it just to get the process going but if you were to shoot a "new video" I think that is what I would do
I appreciate the insight- I think you are right!
Essentially you can do everything as "one video" and have "different sections" and "time stamps" too. Another idea
1-2 min mark = how to use it; 2-5 min is what is required and 5-7 min mark is FAQ
But essentially makes sense, easy to work with and I think long run it will work well
But I like it, great job on this. Looks great :meow_attention:
Thanks man, actually could not have done it without you contributing so much to the v1 on excel..
and we aren't done yet. We gotta start moving our pipelines over to read from pctidv2 instead of pctidfrom_excel
That makes sense. I am glad to be of help, and let me know if you still need more help with it. Since, from my understanding, once we get that S1 stuff done and merge it with what we have in the BigQuery table, it will essentially be that "one big table" that Ryan has wanted since day 1
Hey there, what are the 4 leads that are not showing up?
Hey James. Should be the ones that are not Yellow highlighted rows
Testing it but I /think/ I got the timezone stuff figured out
Heyo, I'll swing by to you in a few minutes
Hey James. Just wrapping up some financial stuff for flatirons. Will be available once complete. Sorry for the delay
I found what may have been causing some of the bugs and I think I fixed it.
When you get back, let's run the numbers again for today and see what we find
Sounds good! Going to take a quick lunch since time has been flying fast and I haven't gotten the chance to
Thank you 🙌 Ahsan is leaving next week so I gotta take all the notes I can about the financial stuff
So I was looking through the data and I realized that despite declaring the data as Pacific time, it was presenting it as UTC, because that is the industry standard
I wonder if it would be better to just do a 1-for-1 comparison using the date 2 days before the current date instead of yesterday, so that we know it should match regardless of UTC or PST, etc.
So we were feeding 5pm Pacific time, and to keep Ryan's end from having to mess with timezones, we were doing the time zone call thing, which was skewing it, but still outputting UTC data.
Since when we do financials, we are not doing it on the 2nd or 3rd but probably the 7th or 8th lol
So I modified it to save as Pacific time data, but it will still spit out as UTC unless called otherwise
Because the query "Shield Legal - Revenue Per Lead Query - Date Selectable" calls for a time zone conversion in where clause and not the select cause it pulls the right window of time
I can tweak it to include a new column called "earliestesignpacific_time"
So in your queries to count up the day- you'll still be solid using that scheduled query and column field
but know that the time is now correct, but presented in UTC
That makes sense. I think the thing that makes it "tricky" is that Ryan wants me to post first thing in the morning, and it takes time for those "missing leads" to kick in, so it will always be off in the morning since they are not in the database at all until later. That's where the discrepancies kick in. Another alternative is to push the "end date" to 2 days ago instead of yesterday; by then all the leads should have been pushed through, which wouldn't be an issue for the financials since we don't submit anything until the 7th-10th, based on my experience
No changes needed for that post first thing in the morning as long as you use that scheduled query ( Shield Legal - Revenue Per Lead Query - Date Selectable ) to get the leads for any given time window.
That query has the time zone conversion to PST within the where statement so it makes sure that the time window it's searching for is accurate.
So in that statement, if I'm saying give me everything on May 4th PST in that where statement..
Even if it happened at 5am May 5th UTC.. It will still show up, because that's the correct window.
Make sense?
Like that call states the 4th in Pacific time
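Right- something like this is all the WHERE clause is doing (the column name here is a placeholder):
```
-- Sketch: the timestamp stays stored and displayed in UTC, but the filter converts to
-- Pacific time before picking the day, so 2025-05-05 00:00 UTC still counts as May 4th PT.
SELECT *
FROM `tort-intake-professionals.FinancialLogDataset.BillableLeadsWithRatesand_lp`
WHERE DATE(earliest_esign_timestamp, 'America/Los_Angeles') = DATE '2025-05-04';
```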
I have a question for you. I believe we had a sheet that shows all "final statuses" ? I am trying to find where that is lol
Hey there~
Not sure about the sheet, but the status table on the database has a boolean value for whether a status is terminal
tort-intake-professionals.tipprodapplication.iotiplrstatusrates
Has every status in Lawruler, and if it's a "final status", terminal should be true
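So something like this should list them- I'm going from memory on the column names, so double-check before leaning on it:
```
-- Sketch: pull every 'final' status; the billable flag should work the same way
-- if you need billable statuses instead (column names are from memory).
SELECT status_name
FROM `tort-intake-professionals.tipprodapplication.iotiplrstatusrates`
WHERE terminal = TRUE;
```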
Gotcha, yeah I figured there was a table or sheet for it, since the table that pulls the financials only uses "finals"
Ohh- you're looking for billable statuses, not the "final status" that a lead ends up in
Ah I see! That is good to know! Are there any other tables or sheets that also update those billable statuses that you are aware of?
Hmm you know what, I think that table should be good enough. Thank you @James Turner
What do you mean, also update those billable statuses?
Here's my query if you need it, but essentially just knowing which ones are billable is good
Like other tables that determine if a status is billable?
Sorry, I had a typo; I was asking if you knew of any other tables that also display all statuses that are billable
But this one, if it is being updated, should be good enough
Currently working on the SQL joining stuff, but posting for your and my reference
Heyo~ With you moving to the shield side and going fully into Flatirons, what do you think about canceling our 1:1 and making it an as-needed thing? I'm still always happy to help, I just know we are both busy now haha
Hey James 🙌 I am down for that. Once I get the flatirons in a nicer place then I think being able to meet just to all be on the same page about things may be a good thing to have 😬
Sounds good!
Also, I am thinking of creating a sticker / t-shirt website with a friend of mine. I wanted to pick your brain (during your free time) for ideas you think may make good content lol
Also not sure if you liked any of the ones I showed you today
Lol I really did like those, those are really cool!
For sure! I used to draw with alcohol markers and get the big clear packing tape and make my own stickers
I liked the select * from feelings WHERE reason IS null
actually- was that WHERE reason = NULL or IS NULL?
I liked that one lol
If you do sell em, I think it would be SELECT * FROM feelings WHERE reason IS NULL
but that's just me being really dorky about it lol
I wanted to reach out to you to see if you can reach out to Rose for this:
Hey good morning! I would be happy to reach out to Rose, but I need a comprehensive explanation of how he would like it done on the 'other revenue' page. Is Rose ONLY adding up revenue from staffing? What does he mean by "new system operating expenses and bee other revenue"?
Yeah, honestly I am not 100% sure what he means by that and was hoping you may know what he is referring to lol
Good morning James. Happy Tuesday. Were you able to get more clarity on the above / know what to do for Rose's part? Just a friendly check
You're good, I let her know, and ryan sent a message about what to put on there in the chat
Oh okay perfect. Yeah, it is basically three spreadsheets that have $ values on them, and she has to add May's #s
Take from TIP to give to SL = profitable is what I am hearing
Smaller companies (fewer employees) generally have more tax forgiveness due to 'small business'
so move all the profit from TIP to Shield to bring home more money.
Also, I confirmed with him twice, knowing my question is a yes, so that when I do it, if something goes wrong it isn't completely my fault since I followed instructions and confirmed twice lol
Yup, plus I keep a .csv backup before the changes, and I confirm it before updating and post the updated .csv as well. Since there is no "backup", I gotta create a temp one lol
I'm doing the same thing but a little tweaked. I just created snapshots of the rev rates and rev rates history tables so in case I nuke it, I have backups
Are you understanding the "calculation" parts of how gross revenue and net revenue is calculated?
I pretty much use a google spreadsheet he shares with me to calculate that so I wonder if I just need to zero out the formula for those
Honestly not really- That's what I kept trying to ask Ryan about but he kept interrupting me.
On my new system, I know that I need to go into my iolrcasetypesrevenuerates table and update all rows for existing case types to have an iocalltimeflatfee of 0 for May going forward. The way these tables work is that iolrcasetypesrevenuerates is the "current" pricing of everything, while tort-intake-professionals.tipprodapplication.iolrcasetypesrevenuerateshistory is the price of each item, with multiple rows that cover different time windows depending on when the lead became billable
I feel like if I ask him a question he is going to yell or get frustrated when I am trying to do a task to make his life easier. I was told he wanted me & Zek to help him so that one day he can take a vacation where he doesn't have to work, & it is kinda hard to do that if I don't understand it exactly, since I didn't build it lol
Correction- all of the campaigns where the iocasetypecategory is ret
and in your case, I believe it's all campaigns where the iocasetypecategory is RET, and the marketing source for the campaign is also shield legal
Yeah, that makes sense. I'm just trying to figure out how to calculate gross revenue and net revenue
I think I will ask Ryan if I zero out talk time on his spreadsheet if it will give me the answer I need
another correction, I meant iotipservice_type not case type category
Ahhh gotcha, thank you. And to confirm, the Five9 table (I can get the name if you need me to), the one that has all the talk time history, that one isn't going away?
I ask since I think you posted earlier about updating to a new table
We are keeping all of the storing of talk time as is, just changing the cost from $2 flat rate to 0
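On my side that change is probably just an UPDATE along these lines, but I'm guessing at the underscored spellings of those table/column names, so treat it as pseudocode rather than something to run:
```
-- Rough sketch only: identifiers are guesses at the real names mentioned above.
UPDATE `tort-intake-professionals.tip_prod_application.io_lr_casetypes_revenue_rates`
SET io_call_time_flat_fee = 0
WHERE io_tip_service_type = 'RET';
```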
Thank you James. That is the exact post I was looking at and still confused
When I asked him that he said the other two fields needs to be recalculated
I do not understand how those are calculated since he does that for me when I simply input the data
I run a query to get the values, I paste them, and I get the answer, so I am confirming with him whether changing the talk time formula to 0 will give me the proper output.
But what is confusing me is that when I asked him that, he talked about a different BigQuery SQL code I run, which now makes no sense lol
I think I will simply remove the $2 calculation from it, send him the before and after changes, and ask if it looks correct to him; if not, I can re-update it later on.
Hey there, I talked to Ward and he said you needed a little extra help? I don't have access to the PG financial log, but after stepping away for a few minutes, I think I understand what he's asking you to do
I am just confirming with him to be on the same page, and once we get that info (price + case type names) it will be an easy change
Note that nobody ever actually told me what campaigns specifically, just that super vague name and 'filter' so I went through every campaign with those phrases in the name and did what was NOT a secondary, and what already had pricing to indicate it was active
Yeah, that is what I would have done too. I appreciate you finding those and getting the answer for pricing 🙏
Ryan ended up giving the pricing. I messaged tony 3 times, and finally talked to him in person where he said he would send it
I waited an hour, got no response, told ryan and he came up with the 216
Ahhh I see, but the ones you posted should be correct for me to follow, right?
If not, we have the record of what to change/add, that's why I threw it in there
Yes that makes sense. He will let us know if it needs to get updated again haha
You all good going back and updating all those to 196?
Cool, glad to hear it!
Wanted to let you know Ryan gave you some big shout outs this morning on our meeting
He seemed super happy with the work you did while he was gone
Aw thank you for letting me know ! I appreciate being appreciated haha :meow_attention:
Hey there, do you know anything about this new 'Outbound' Call type? It looks like Tony's team may have made a new 'call_type' in Five9 and started using it the day that Ward went on Vacation.
> Can we please make a looker enterprise account for Anthony Sobo and Christian Rodgers
Joking. What type of access do they need? Viewer, editor, or?
Okay, sent one out to Christian, so he will receive an email to get his login stuff
Oh shoot- for some reason it never sent the slack notification
Oh no worries. I think he rescheduled it for a different day
Hmm, let me check. I shall screenshot you what options I am allowed to give him and what he has
Hey there, I am here with Ward and Carter
So far it's mostly just people who haven't read the doc, talking about things that are explained later in the doc
Ah gotcha. That does make sense haha. Thank you for the update James on this
And Ryan wanting to now add 2 more pieces of software to be replaced (Copper and PandaDocs)
That is a good catch too, since I overheard they want to build this as one unified system. I think we need to get the majority of the systems off this one, and we can ping from Postgres / GCP
Data Set Update: • contactdob and contactcellphone from the Lawruler Contact Card have been added to the alllawrulerleadswithenrichment and 'leads with five9' datasets
Woot woot great job @James Turner and once again thank you for your assistance with this as well 🙏
For sure!
Hey James. I wanted to give you an update on the LookML issue I had. I did them manually one at a time, instead of creating the model and view together and then trying to commit, and that helped.
Appreciate
Hi James. Can you let Ward know about the situation since you guys are in the same office? Ryan tagged me to let Ward and Tony know, but if you are in the same office as him, you can probably explain it better than I can, if that is okay
I'm confused where people are getting a billable date of yesterday? These got signed back in 2022..
It is because Law Ruler changes the e-sign date based on the latest status change date
Since these were never in the TIP system (my guess), that is why it is showing yesterday's date
If they were in the system then the date shouldn't have changed
(( Wait- Is it still okay to make "Hell yeah Brother" jokes since he passed away?? ))
'iscurrentlybillable' and 'billable_date' are based on a few factors.
What it looks at, is the current status of the lead, and if that current status is a TRUE for 'billable' on the iostatusrates table.
To determine the 'billable date' of the lead, it first finds the earliest instance of that lead becoming e-signed and uses that timestamp for the billable date. IF the lead is somehow billable, but never got an e-signed status (Think secondaries, or outsourced leads) it goes by the date that the earliest 'billable' status was set. IF the lead is currently in a billable status, but we have no record of when it got into that status, then the billable date becomes the lead creation timestamp in Lawruler (Very rare, but it does happen on leads that were copied from a billable lead and the previous lead was deleted)
Little complex sounding at first, but really it's pretty simple, and just has a couple of fail safes built in
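If it helps to see it written out, the fallback order is basically a COALESCE like this (these exact column names are made up for the sketch; the real logic lives in the pipeline):
```
-- Sketch of the billable_date fail-safes: earliest e-sign, else earliest billable
-- status, else the lead's creation timestamp in Lawruler.
SELECT
  lead_id,
  COALESCE(
    earliest_esigned_at,          -- 1) first time the lead hit an e-signed status
    earliest_billable_status_at,  -- 2) first time it hit any billable status
    lead_created_at               -- 3) last resort: creation timestamp in Lawruler
  ) AS billable_date
FROM `project.dataset.lead_status_summary`;
```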
Yeah I recall you mentioning this and I will save this as a note
I have one last one, actually, that I need your assistance on finding.
Sure thing- What is all the info we have on the lead?
Potential campaigns could be:
1,842 - CPAP Settlement Package - DL - Flatirons
643 - CPAP - Flatirons CSP Medical Records - Dicello Levitt
598 - CPAP - DL - Flatirons - Shield Legal
That makes sense, I was wondering why we were looking at such an old campaign
Give me one moment, I think I might be okay. Sorry, my mind is all over the place with my notes, since I have to explain what is going on to Nancy, explain it a different way for Zek to update, and keep notes for myself too lol
Give me about 5 minutes to confirm if I need another search if that is okay
You're all good- I can hang out a little late tonight til like 5:15ish if needed
Thank you James. I will message you in 5-ish minutes if needed
I gotta head on out but here is that query I used with comments so it's easy to re-use
WITH status_log AS (
  SELECT *
  FROM `tort-intake-professionals.lr_data.lead_history_status`
  WHERE DATE(`date`) BETWEEN DATE '2025-04-28' AND DATE '2025-05-04' -- Edit date range here
),
case_type_info AS (
  SELECT
    lawruler_lead_id,
    lawruler_casetype_id
  FROM `tort-intake-professionals.tip_ops_dashboards.All_LawRuler_Leads_With_Enrichment`
  WHERE lawruler_casetype_id = "1879" -- Edit Campaign ID here
)
SELECT
  s.*,
  c.*
FROM status_log AS s
JOIN case_type_info AS c
  ON s.lead_id = c.lawruler_lead_id;
I think I am fine for now but if needed I will use that code 🙂
Hi James. I think I am good now. I sent my email to them and I let them know I will have those updated tomorrow (zek isn't here anymore to update those). Thank you once again for your help on this
Happy to help! Lemme know if you need help finding anything else
Good morning! Sorting through the shared folders in Looker enterprise and I see we have three different folders for Flatirons. Is there any way we can consolidate all of that into a single root folder? We can still add Subfolders as needed
Good Morning James ! Currently I am still thinking about how to go about that for Flatirons. I like separating certain folders for specific coding so that I don't mix them together but I am open to your ideas as well
But yeah I didn't think of subfolders so I think that part may just work
This covers how the structure for each primary firm should be set up, that way we can do permissions at the folder level
The Dev order summary example, I am assuming that one is Ryan's, but I think that folder name does sound good
That looks correct to me. For the second screenshot, I agree it is based on the folder level
And here is how we set up the permissions... I'm making roles and group blueprints right now.
Let me know if you can think of anything else before I start implementing
So permissions for things like being able to explore or save a dataset is given at the role level, but you can group individual users into a 'group' for easy adding
That does make a lot of sense! I think it actually looks very organized and I am impressed
Only thing I would comment: as far as "reporting team" goes, if you are referring to Brittany's team, they do not handle financials like us, but if you mean "report team" as a general thing, then I would say yes to that
Hey James. I wanted to pick your brain. Is it possible the "app" aka retool that mal uses can eventually have these features or do you think it is a limitation of retool in general ?
It might be a little tricky to implement, but I don't see any reason it couldn't. Depends what she's trying to bulk upload, and select multiple of
By the way, check this out-
```# Within a lookml .view file:
dimension: all_firms_array_raw {
  hidden: yes
  sql: ${TABLE}.all_firms_array ;;
}

dimension: client_code_match {
  type: string
  group_label: "Access / Ownership"
  description: "Row-level access: equals the user's client code if present, or 'ALL' for override users."
  sql:
    {% assign cc = _user_attributes["client_code"] %}
    {% if cc == "ALL" %}
      'ALL'
    {% else %}
      (SELECT ANY_VALUE(f)
       FROM UNNEST(${all_firms_array_raw}) AS f
       WHERE f = {{ cc | sql_quote }})
    {% endif %} ;;
}
```
If I set this up right, we can make it so we can set up row-level permissions on models, where if their law firm exists within all_firms, it gives them permission to see that row..
So when we set up user profiles, we set their client_code. For internal people like Marc, we set it as "ALL"; for the company-external people, we just set the client code to whatever the company name is in PCTID
That looks pretty cool! I don't know the coding for it unless I see it in action, but what you are saying makes sense
I haven't tested it yet, so I can't guarantee it works lol
Are you available for me to head over there for a few minutes and test something with you??
cool, I'll head over in a few- I appreciate it
Hey there, I am attempting to build a map like you have for the abuse states, but I have over 5k leads.. I remember you had a really cool summary measure thing of how many leads per zipcode, how did you set that up?
Hey James. Are you trying to build a map where it looks at what city they are in?
I was gonna go by zip code but anything like that yeah
If so, I think a better approach would be to go for states, since it is simpler. I had to manually look up the longitude and latitude for each city through ML and then input that into a table. It isn't really scalable unless there is some type of table out there that gives us those values and that we can pull from through an API; then I think yes
Hmm dang- Yeah that's not really scalable to realtime without heavy backend work..
Even if I go for state, I'm still going to have 1 row per lead which will only give us the first 5000 leads max
How did you get the interactable map? Was that the Google Maps visualization?
Yeah, it would require backend work, which may take some time. That is true about the 5k limit. It is one of Looker Enterprise's visualizations; the map part is interactable by default when they click on it
I will look into how to get that data though, since I think having the city level looks pretty cool, but it may just be overkill. Maybe just do 5k, and anything that doesn't show up may not be so significant. Ideally, how they like me to do the dashboards is one dashboard per campaign (each campaign includes intake and secondary), so with that you won't have to worry about the 5k limit for most campaigns
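One more thought: if we aggregate to one row per state instead of one row per lead, the 5k limit stops mattering for the map. Something like this (the state column name is a guess on my part):
```
-- Sketch: lead counts per state, so the map visual gets ~50 rows instead of one per lead.
SELECT
  contact_state AS state,
  COUNT(*)      AS lead_count
FROM `tort-intake-professionals.tip_ops_dashboards.All_LawRuler_Leads_With_Enrichment`
GROUP BY contact_state
ORDER BY lead_count DESC;
```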
I set it up in a way that asks for and supports selecting a broad tort type or campaign first and I think that works pretty well
It's looking good so far! Great job on this :meow_attention:
Thank you- It's taken HEAVY inspo from your mormon page
From one inspiration to another inspiration is sometimes the way of the warrior 🥷:skintone2:
Hey there- is the information that feeds into Ryan's 1.3 in Looker Enterprise? I want to build something out to compare the new and old systems
I know the actual dashboard is- for info, I'm trying to find a way to reconcile between the old financial systems and NewFin. I saw July closed yesterday with around 3545, but my NewFin system said 3575, so I wanted to find the differences
I see. My guess is you want to find the tables associated with it so that you can compare it against NewFin with code to spit out the differences, huh
Yeah- I want to find the data that shows the leads deemed billable in the old system
My suggestion would be to make a copy of the dashboard and play around with the internal table settings, since I am not too familiar with where he is pulling it from
Hey there, I need access to Ryan's postgres so I can start doing data verification- Do you know how to set up that connection in PyCharm, and what table the finalized leads are in?
I believe he sent us credentials for it a while back; I would probably do a Slack search for pgadmin or credentials
Hi @James Turner. May I ask for your assistance when it comes to finding some leads that went from S&D status to any other status for specific campaigns?
I have a deadline to send to cameron for flatirons and currently having some struggles
Happy to help where I can, just to make sure I understand- you're looking for all leads that were "Signed and Declined" and went into a different status?
WITH status_log AS (
  SELECT *
  FROM `tort-intake-professionals.lr_data.lead_history_status`
  WHERE DATE(`date`) BETWEEN DATE '2024-06-10' AND DATE '2024-06-16' -- Edit date range here
),
case_type_info AS (
  SELECT
    lawruler_lead_id,
    lawruler_casetype_id
  FROM `tort-intake-professionals.tip_ops_dashboards.All_LawRuler_Leads_With_Enrichment`
  WHERE lawruler_casetype_id = "605" -- Edit Campaign ID here
),
enriched_lead_data AS (
  SELECT
    is_currently_billable,
    lawruler_lead_id
  FROM `tort-intake-professionals.tip_ops_dashboards.All_LawRuler_Leads_With_Enrichment`
)
SELECT
  s.*,
  c.*,
  en.is_currently_billable
FROM status_log AS s
JOIN case_type_info AS c ON s.lead_id = c.lawruler_lead_id
JOIN enriched_lead_data AS en ON c.lawruler_lead_id = en.lawruler_lead_id
-- WHERE from_status LIKE "%DEC%"
--   AND en.is_currently_billable = TRUE