@Brian Hirst Do you know if we are using the "Case Id" field in LawRuler for anything?
Which field is CaseID? I'm not seeing it. To answer your question, I don't use it for anything. I'll need to know which field you're talking about for sure to say yes or no.
Ok I think it’s actually hidden from the normal dashboards so that’s good
We’re looking for a field not being used
If you have a DIN radio you can use this. Hella easy to install: https://www.amazon.com/Touchscreen-Wireless-Multimedia-Navigation-Bluetooth/dp/B0BLH2HLGF/ref=ascdfB0BLH2HLGF/?tag=hyprod-20&linkCode=df0&hvadid=647168045652&hvpos=&hvnetw=g&hvrand=17021171431291719540&hvpone=&hvptwo=&hvqmt=&hvdev=c&hvdvcmdl=&hvlocint=&hvlocphy=9031316&hvtargid=pla-1967566067721&psc=1&mcid=b2e5157b0aa43a64aff4e2daa8516964
I’ll definitely see if this works! Thanks!
Hey Nick, are you free tomorrow morning? I want to get everything in order for what needs to be done by who. I've got a lot of work ahead of me making all these custom JSONs for each contract, so I want to be able to just change them all one more time and not have to go back and do it again
Whatever time you get in and have an hour to jump on with me and James to go through it
Let’s do it early. I got a lot on my plate and need to knock this out
maybe closer to 6:45-7:45 😂 I'll slack you after I grab my coffee and am on my way.
Sounds good James probably needs to be on part of it too
Sounds good! Closer to 730 when I get back from the gym
Currently 131 live contracts in LP. It has taken about 25 minutes to set up and flip each one from Vici to Connex once I am in a working rhythm and not interrupted. I've done 1/2, but I need to go back and amend all of them with the new fields. That's about 12 minutes per contract for the ones that are already made, so roughly 12 hours, then the second half of the new automations at about 27 hours of work. Moving forward this will also add ~10 minutes to new contracts.
When I talk to him I’ll let you know
I wasn't holding my breath haha
Would you be able to hop on a call with the ConnexOne guys and me about the current test with Zapier?
Hey, I'm mobile right now. Just got lunch and am at a coffee shop working. Contact Mark Maniora about the Zapier testing; he's got all the necessities
Ok are you free tomorrow morning?
Hey try seeing if you have the Shield project now
I can check in a bit I’m in security at the airport
Hey Nick, Ryan told me to tell you we're on
I'm having a brain fart. Can you tell me why this query isn't working?
SELECT *
FROM `tort-intake-professionals.leadspedia_all_contracts.leadspedia_all_contracts`
WHERE contractnamex LIKE '%Wood%'
Well it works but it is giving me no data
TIP is what everything is running on, isn't it?
Well in that case I can go back and take a look at that one, but worst case we shift to the SL project
My apps run off there but the tables are all good.
Ok, so I ran that query in Shield BI and it is missing these 2 contracts
Ok so interestingly those were the ones that Mal said she was missing yesterday too
The only thing I can think of is if any of the IDs are off vertical, buyer, contract, etc. it would potentially drop it because it can't find a match
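If it is an ID mismatch, a quick way to confirm would be a LEFT JOIN that flags contracts whose IDs don't resolve. Rough sketch only; the verticals/buyers tables and column names below are guesses, not the actual schema:
-- Hypothetical check: contracts whose vertical or buyer IDs have no match
-- (verticals/buyers table names and ID columns are assumptions)
SELECT c.contract_id, c.contract_name, c.vertical_id, c.buyer_id
FROM `tort-intake-professionals.leadspedia_all_contracts.leadspedia_all_contracts` AS c
LEFT JOIN `tort-intake-professionals.leadspedia_all_contracts.verticals` AS v
  ON v.vertical_id = c.vertical_id
LEFT JOIN `tort-intake-professionals.leadspedia_all_contracts.buyers` AS b
  ON b.buyer_id = c.buyer_id
WHERE v.vertical_id IS NULL
   OR b.buyer_id IS NULL;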
I forced another sync for the TIP project. Currently it is scheduled to update every 6 hours. It says it ran once this morning but running it again just in case
When I pull just the contracts report I get only 9 contracts
These are the 9 contracts that it has to merge to
Do you have a bigquery table for "leads post log" from leadspedia
Hey sorry that I am just getting to this but what do you mean by leads post log? Like just a table of all of the leads in LP? If so, then yes.
If you have access to the SL BQ then it's in the leadspedia dataset and you can use either the all_leads or all_sold_leads tables
https://console.cloud.google.com/bigquery?ws=!1m5!1m4!4m3!1sshield-legal-bi!2sleadspedia!3sall_leads
It's not sold leads or all leads. It's the "Leads Post Log." Specifically, I'm looking to match the Lead ID from that log with the message token of "lead already exists" to the corresponding lead in "all leads" that has the response of "no match"
In LP if you go Leads>logs>leads posts log you'll find it
Ryan said it would be in a bigquery table
Ok I will look for what you're talking about and let you know
I am going to put a ticket in with LP to see if they can quickly point me in the right direction
Hey figured it out and made this table for it: https://console.cloud.google.com/bigquery?ws=!1m5!1m4!4m3!1sshield-legal-bi!2sleadspedia!3sposts_log
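For the match you described, something along these lines should work once the data checks out. The column names (lead_id, message, response) are assumptions, so swap in whatever the tables actually use:
-- Sketch: posts_log rows flagged "lead already exists" joined to all_leads rows marked "no match"
-- lead_id, message, and response are assumed column names
SELECT p.lead_id, p.message, a.response
FROM `shield-legal-bi.leadspedia.posts_log` AS p
JOIN `shield-legal-bi.leadspedia.all_leads` AS a
  ON a.lead_id = p.lead_id
WHERE LOWER(p.message) LIKE '%lead already exists%'
  AND LOWER(a.response) = 'no match';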
@Brian Hirst can you check the data that I pulled vs what you expect there to be? It oddly pulled exactly 247 records for each month, which seems unlikely to be both exactly the same and that low
Hey I figured out it was just an issue with the code
I thought there would be more than 1k records so it was pulling the same set every time
It's fixed and has the 265 records that are showing in LP
please review the table now and let me know if this is what you needed
I'm not sure if it's what I need yet. I'm still trying to figure out how I'm going to try and solve this
This endpoint seems to be different from all the others. You can only pull by DAY, so if we want the logs since 2019 I have to iterate through every day plus paginate if over 1K.
I will update the function to at least pull every day rn for you to test with, but we will have to make some new code to handle this
Well right now it will update and then drop it every day until we can get it fixed, or I can just leave it as a specific day and then manually update/change the date when you need it in the meantime
Hey I worked on the code and I am pulling in everything since the beginning of LP. Have to iterate through every day since 6-1-2019 so it is going to take a minute for it to pull everything in.
Once I have done that I will switch it to just adding each day but it will try to do it every hour by appending the new information. I will then need to build in a deduplication of some sort to remove the multiple iterations for the day. I have added a load_timestamp column so in the interim your SQL query will need to deduplicate off of the latest value in that day.
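Something like this should handle the interim dedup in your query; I'm assuming lead_id is enough to identify a row, so adjust the partition key if it isn't:
-- Interim dedup: keep only the most recently loaded copy of each post
SELECT *
FROM `shield-legal-bi.leadspedia.posts_log`
QUALIFY ROW_NUMBER() OVER (
  PARTITION BY lead_id
  ORDER BY load_timestamp DESC
) = 1;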
It just needs to be moving forward, not historic
Lol well too bad you have everything now
We can now work on the daily update portion
How often will you have to do this matchup? What is it for exactly? The reason I ask is, if I were to make a script that at 12:01 deletes all but the latest record for that day, will that be okay? During the day you will have to dedup it yourself, but it will be clean the next day.
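This is roughly what I had in mind for the 12:01 cleanup, keeping only the latest load per lead (again, lead_id as the key is an assumption):
-- Nightly cleanup sketch: delete every copy except the latest load for each lead
DELETE FROM `shield-legal-bi.leadspedia.posts_log` p
WHERE p.load_timestamp < (
  SELECT MAX(load_timestamp)
  FROM `shield-legal-bi.leadspedia.posts_log`
  WHERE lead_id = p.lead_id
);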
Hahahaha. So some leads are “no match” in LP bc they are dupes in LR. “Duplicate” is reserved for LP duplicates only
Ok, I will have to include the de-duping on the pull from LP each time it pulls. This is something we have been needing to do anyway, but it can be tricky to implement. I will start working on it now though, and you at least have everything up to this point. If you need an update I can try to do it manually
Hey Nick, you have 15 minutes? Something with these postings is making 0 sense and I cannot find where the issue starts
what was one of the ones we did not have a slack for?
I got access to that email account
I want to see if they are in this mailbox
DL - FL - 439 - 1665 - Tracy Robinson - S1:PL-FacebookMB-MLChowchillaFORMai6
DL - FL - 439 - 1665 - Amorette Becker - S1:PL-FacebookMB-MLFORM240811
DL - FL - 439 - 1665 - Kori Butler - S1:PL-FacebookMB-MLChowchillaFORMai6
DL - FL - 439 - 1665 - Tierra Tittle - S1:PL-FacebookMB-MLChowchillaFORM2009AI2
DL - FL - 439 - 1665 - Rashanda Dawkins - S1:PL-GoogleMB-CRCMP-ChowchillaADG-search
Tracy Robinson has an email from 9am
Hey, for the media buyer spend, are they going to be entering the spend individually or as an aggregate? For example, if Maximus spends $100 at 10am and then another $100 at noon, does he put $200 in the second entry, or does he just put $100 each time?
One entry for one day for now. Spend is not updated enough in the platform for a media buyer to accurately say what his spend is hour to hour.
So I got access to that email account
I get the Final emails but not all of the Slack-related ones
Ya so there is an email that just has the LP ID that seems to be the trigger for the slack info
Hey, you got a sec? having an issue with not all verticals being in bigquery
Did you make the verticals table in the SL BigQuery database?
Which verticals table? Which project?
Ok are you talking about verticals in Leadspedia on SL?
I'm on with James RN making this spend app media buyer proof
Last we spoke was about the verticals
Happy to hop on the call to understand what is being done
Media Spend app works great, I've been using it manually to populate spend data daily. Going to meet with the media buyer to show how to use and upsert
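The upsert itself is basically a MERGE keyed on date and buyer. This is only a sketch with assumed table and column names, not the app's actual schema:
-- Daily spend upsert sketch: one row per buyer per day, overwritten if re-entered
-- Table path and column names are assumptions
MERGE `shield-legal-bi.media_spend.daily_spend` AS t
USING (
  SELECT DATE '2024-06-01' AS spend_date, 'Maximus' AS buyer, 200.00 AS spend
) AS s
ON t.spend_date = s.spend_date AND t.buyer = s.buyer
WHEN MATCHED THEN
  UPDATE SET spend = s.spend
WHEN NOT MATCHED THEN
  INSERT (spend_date, buyer, spend)
  VALUES (s.spend_date, s.buyer, s.spend);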
Ya sure and if you can show me the data that is getting pushed to GCP
Jump on the meeting I just sent and I'll give you a quick run-through. James is helping me get through a block but I can show you how it works real quick
Hey Nick, you have 3 minutes? I have a question about how you were mapping and a potential oversight we can set a best practice for before it turns into something manual
Can you meet a little later today?
This is something we are trying to move away from
Would it be best to have the LP vertical name in the LR casetypes for more continuity?