@Brian Hirst @Luke Fontaine @Nicholas McFadden, the architecture and flow example we can use with ByteCode today for learning the "standard connector" for training after Nick gets his non-standard connector training.
@Nicholas McFadden has joined the channel
@Nicholas McFadden, please send us invites to Fivetran for ryan@shield-legal.com, brian@shield-legal.com, and luke@shield-legal.com so we can build pipes.
Plus james@shield-legal.com and mike.taluc@bytecode.io to our account as well.
@Nicholas McFadden, please make me Admin on Fivetran and DBT as well so we can both get Alerts, costs, etc.
@Nicholas McFadden, let me and @here know when we can meet this week to go over this agenda. Probably an hour minimum but we can get straight to access.
ahahh! i thought it was cuz we were just talking about Google Colab and then i see the architecture next to it lol
Our brains are MUSH after the ByteCode session, time to wind down and let it sink in....
@Quint Underwood
Another new Media Buyer to setup. Also, could we fix a variable in Master_Append too.
Three more work tasks now:
*Thread Reply:* @Quint Underwood, more on the #3 item in pseudocode:
IF rawstatusreport.current_status changes from any form of "Signed e-Sign%"
(e.g. Signed e-Sign > Signed e-Sign QA > Signed e-Sign QA WIP > Signed e-Sign FINAL > Signed e-Sign RE-TRIGGERED)
THEN
update finlog table currentstatus to reflect the latest version of the Signed e-Sign status
(e.g. on 1/25/2022 lead id ###### is Signed e-Sign, but on 1/26/2022 it is now Signed e-Sign FINAL, update table)
@Quint Underwood, the same is true for Case Reviews
EXCEPT
If the status is NOT a form of Signed e-Sign (meaning it goes to another status like DECLINED or Admin Review or others), then DELETE from table.
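For reference, here is a minimal BigQuery SQL sketch of that rule. The table and column names (fin_log, raw_status_report, lr_lead_id, current_status) are assumptions, not the production schema; the real logic lives in the Python lambda, this just illustrates the update/delete behavior.

```sql
-- Hedged sketch only: names are assumptions, not the actual TIP tables.
MERGE `tip.fin_log` AS fin
USING `tip.raw_status_report` AS src
  ON fin.lr_lead_id = src.lr_lead_id
-- Lead is still in a Signed e-Sign variant: keep the row, record the latest variant
WHEN MATCHED AND src.current_status LIKE 'Signed e-Sign%' THEN
  UPDATE SET fin.current_status = src.current_status
-- Lead moved to a non-billable status (DECLINED, Admin Review, ...): drop it from the log
WHEN MATCHED AND src.current_status NOT LIKE 'Signed e-Sign%'
             AND src.current_status NOT LIKE 'Case Review%' THEN
  DELETE
```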
*Thread Reply:* @Quint Underwood, good morning. Hope you enjoyed the SB; guessing if you support an NFL team, your Chiefs or your 49ers made it a good game?
How we coming on #1 lambda update?
*Thread Reply:* Morning @Quint Underwood, how we coming on these 3 items, specifically number 1?
*Thread Reply:* Hey @Ryan Haha you got it, had a winning team either way. Still a Chiefs fan at heart. Though I (briefly) knew McCaffrey in college, he played video games in my dorm room a couple times
*Thread Reply:* On the TIP feature enhancement, I'm setting up a mini "dev environment" for testing so I don't jack up the TIP database or interfere with the production lambdas. I cloned the TIP database already, but I still need to deploy a separate "development" TIP lambda function. Once the logic looks good, I'll merge back into the existing lambda and delete the extra database/lambda.
*Thread Reply:* I canβt promise it before the weekend but I can try
*Thread Reply:* @Quint Underwood, did you beat him in video games, McCaffrey? LOL
*Thread Reply:* @Quint Underwood, how did you progress on the 3 items?
*Thread Reply:* Hey @Ryan still in progress on item #1--saturated with work but trying to get it done ASAP. Item #2 was finished a few weeks ago, item #3 in queue
*Thread Reply:* I did not (probably for the best), but my roommate did. Roommate was Joey Alfieri (defensive linebacker), he tried to go pro but didn't progress past 4th string. Joey & Christian were close friends on the team which is how I encountered Christian occasionally freshman year
*Thread Reply:* Never seen an adult get so livid at losing N64 Super Smash Bros...probably why he's such a good athlete. Every fiber in his being is competitive
*Thread Reply:* @Quint Underwood, how we coming on the 3 updates?
*Thread Reply:* Hey @Ryan see above--still in progress on item #1--saturated with work but trying to get it done ASAP. Item #2 was finished a few weeks ago, item #3 in queue
*Thread Reply:* @Quint Underwood, thank you.
*Thread Reply:* @Quint Underwood, need an update, the deliverable was due a week ago to the TIP team. Please advise.
@Quint Underwood has joined the channel
@Ryan I just added the new media buyer to dailyFinancials lambda. Here's the video demonstrating my workflow.
NOTE: It's also possible to edit the files directly in Github through your web browser (Brian does this for updating the Slack <-> Leadspedia csv file), rather than through terminal.
hey guys here is a table we need to add to the schema
Which of the prebuilt connectors would be good to use as the first trial/example (Facebook Ads, TikTok, etc)? Does anyone have an interest in seeing one over the other first?
@here, more DiCello induced interruptions for the second week for me, with yet another custom report being requested of me, so I will push the Admin meeting.
@here what is our GitHub account info? If we don't have one or should make a separate one for BI projects, I can create one using the analytics@shield-legal.com email for the account.
Let's do that @Nicholas McFadden, a new clean account.
ok the password will be: SLnv$7180
Who has access to that email account? I need a confirm code from it
Never mind looks like it is already forwarded to me
@here, morning! I spoke to Andy and the CRM Copper (recommended to us by Cam's brother) is going to replace the need for our own "Shield Orders" application but obviously will change the schema quite a bit. Good news is Copper is a Fivetran connector, so I will work with Andy to ensure the Copper implementation is done in a way that captures our custom fields to process orders.
Copper Pipeline Stages
Sales Pipeline:
• Prospect
• Active Conversation
• Proposal Sent
• Contract Sent
• Contract Booked
Order Pipeline:
Copper Custom Fields
• Advertising Firm (who pays): Copper Company / Leadspedia Advertiser
• Partner Firm:
• Other Partner Firms:
• Case Vertical Name: Drop Down with ID - Vertical Name format
• Leadspedia Contract IDs: text box with commas (1200,1201,1202)
• Call Center: (TIP)
• Sale Type Lead: Contract / Lead
• Dynamic Pricing: Yes/No
• Integration Required: Yes/No
• Case Management Software: Drop-down with "other" option?
• ~Campaign Status: 'Active, Open, Paused, Complete, Archive'~
• ~Change Order: True / False~
• ~Order Amount Paid To Date: Task Type - Order Payment~
• Campaign Start Date
• Est. Campaign End Date: (6 months to 12 months out depending upon expected change orders)
• Campaign Dashboard URL (companies & opportunities)
• Video Ad URL
@Nicholas McFadden, we got pretty far today on TIP BI Schema, check out the updates. Lots more to go then we can do a review soon. Still need to add io_ tables for system application, plus need to create bi_ transformed tables.
https://dbdiagram.io/d/TIP-BI-Database-Schema-658c7cd789dea62799a2f76f
@Nicholas McFadden, question. What endpoint is getting you the "Leads" (Contact + Case Type + Intake Template) as we see them in the Inbox in LawRuler? Status Report?
That "big raw API table" that will be the foundation for a transformed "bilrbillableleads" table that is like our tipfin_log table we have today with financial details per billable lead.
I'm getting the list of leads and their basic info from the "GetInboxItems" endpoint and then matching it up with each question from each of the intakes returned from the "GetIntakeForm" endpoint. However, they have "contact information" for pretty much all intakes, so I'm just using those basic contact details instead of the ones from the inbox
Perfect @Nicholas McFadden, thank you for explaining. Can you send us a "raw" CSV sample of the dataset for "GetInboxItems"?
*Thread Reply:* @Nicholas McFadden, are these ALL the fields we can get from GetInboxItems?
And what is UpdateDateTime defined as? The last time any value in the record was updated, the last time the status was updated, or something else?
• LeadId
• CaseId
• DisplayName
• CreateDate
• CaseType
• Status
• Milestone
• UpdateDateTime
• Contact.HomePhone
• Contact.WorkPhone
• Contact.CellPhone
*Thread Reply:* @Nicholas McFadden, also is CaseID the FK for CaseType?
*Thread Reply:* @Nicholas McFadden @Luke Fontaine @Brian Hirst, from LawRuler Support. Great that UpdateDateTime equals the same thing as "Idle Time," because that becomes the derived e-sign date if we do not have a System.e-Sign.date on the case file / intake template and values. Something we had to handle in the TIP Generate Fin Log Python script.
• LeadId - Lead Record ID in the system
• CaseId - Case Record ID in the system
• DisplayName - The case name field at the top of a case record
• CreateDate - Date the lead was created
• CaseType - Type of case of the record
• Status - Current lead/case status of the record
• Milestone - If you're using the milestone feature, the current active milestones on the lead/case
• UpdateDateTime - Idle time
• Contact.HomePhone - Home phone for the primary contact
• Contact.WorkPhone - Business phone for the primary contact
• Contact.CellPhone - Primary phone for the primary contact
I have updated this google sheet with an example of the current outputs from my requests to the respective LawRuler API endpoints
As I build I will add them there
@Brian Hirst
• 13 - M - Gross Profit x Day
• 31 - Combined Details by Year
• 32 - TC LOSG
• 33 - TC Consulting Costs
@Luke Fontaine
• 30 - TC Details x Year
• 24 - SL Totals x Day
• 14 - M - Ad Platform By Day
@Ryan
• 34 - Combined Call Center Costs
All of us then add visuals. White glove formatting:
• VARCHAR left justified
• DATE & MONEY center justified, 0 decimal points
• Keep color palette the same, but Expenses are red / Revenue is green
@Nicholas McFadden @Mike, any use of UpWork recently for about $1200?
I also wouldn't spend that much without approval
@Brian Hirst @Luke Fontaine, we have not used UpWork in recent weeks have we?
I am not sure what "UpdateDateTime" refers to but I believe it is linked to status
Yes the CaseID is the FK for CaseType
When I hear back from LawRuler on some other things I will ask them about the UpdateDateTime
@here, for future reference. I had to review this in gross detail for Marc:
Shield Financials Calculations
```
Gross Spend = daily_spend + cc_cost + agency_fee

Gross Revenue = daily_revenue

Operation Expense = Fixed Daily Number from Marc

COGS = daily_spend
     + cc_cost
     + agency_fee
     + mb_commission
     + sales_commission

Total Expense = daily_spend        -- COGS & gross spend
              + cc_cost            -- COGS & gross spend
              + agency_fee         -- COGS
              + mb_commission      -- COGS
              + sales_commission   -- COGS
              + operation_expense

Internal Total Expense = i_daily_spend + i_mb_commission + i_sales_commission + i_cc_cost + i_agency_fee + i_operation_expense

LOSG Internal Total Expense = i_daily_revenue + i_mb_commission + i_cc_cost + i_agency_fee

Gross Profit = daily_revenue - daily_spend - cc_cost - agency_fee

Net Profit = daily_revenue
           - daily_spend
           - cc_cost
           - agency_fee
           - mb_commission
           - sales_commission
           - op_expense

Cash Flow = e_net_profit (Shield) - i_total_expense (TC)

** Internal or i_ means any client with (TC), DL/TC, DLG/TC or Steve Slater or Shield AZ Law in the Customer Name based on the customer_type table
```
@here, with Nick tentative for schema work and Luke out, I will push the meeting to next week's session to get LawRuler TIP going.
@Nicholas McFadden, in the LawRuler API tables we are getting and building....
Please let us know so we can get the schema mapped out correctly:
Here is the schema so you can see what we are talking about:
```//LawRuler Lead Inbox raw from API
//
//This is the main table where we access all leads (contact + case_type = lr_lead_id) including status,
//where we use the LeadID to capture the CaseType and CaseTypeID to get intake questions, answers, agent assigned
Table raw_lr_lead_inbox {
  LeadId integer pk
  CaseId integer //fk
  DisplayName varchar
  CreateDate date
  CaseType varchar
  Status varchar //fk
  Milestone varchar
  UpdateDateTime datetime
  Contact_HomePhone varchar
  Contact_WorkPhone varchar
  Contact_CellPhone varchar
}

//LawRuler Case Types List from API
//CaseType Naming convention [case_type_vertical] - [case_use] - [litigating firm] - [marketing firm] - [lead source]
//AltID Naming Convention [marketing Firm] - [litigating Firm] - [advertiser id] - [contract id]
//The m_ (modified) fields are created from parsing the CaseType Naming Convention and looking up AltID
//The io_ fields are input and output from an application for TIP management to update for pricing models
Table raw_io_lr_case_types {
  case_type_id integer pk //API field is ID
  version_ref_pst date
  version_ref date
  case_type_name varchar //original field is called Name in API
  m_leadspedia_advertiser_id integer [note: 'This is a manual field if not found in intake template field question for "AltID"/custom10043']
  m_leadspedia_contract_id integer [note: 'This is a manual field if not found in intake template field question for "AltID"/custom10043']
  m_case_type_vertical varchar [note: 'Parsed from Name field from LR']
  m_case_use varchar [note: 'Parsed from Name field from LR']
  m_marketing_firm varchar [note: 'Parsed from Name field from LR']
  m_litigating_firm varchar [note: 'Parsed from Name field from LR']
  m_lead_source varchar [note: 'Parsed from Name field from LR']
  io_billable boolean [note: 'Updated via Application, default "True"']
  io_billable_rate float [note: 'Updated via Application, default "122.00"']
  io_agent_commission_rate float [note: 'Updated via Application, default "60.00"']
  io_call_time_rate float [note: 'Updated via Application, default "2.00"']
}

//raw table from LawRuler for intake answers from questionnaires
Table raw_m_lr_intake_questions_answers {
  ID integer pk
  CaseTypeId integer //fk
  LeadID integer //fk
  Question string
  Answer string
  IsMandatory string
  Type string
  TypeID integer
  LawrulerField integer
  Priority integer
  IsVisible boolean
  HeaderID integer
  Items varchar
}```
*Thread Reply:* @Nicholas McFadden, read above and let us know. Hope you feel better.
@Brian Hirst
@Brian Hirst @Nicholas McFadden @Luke Fontaine, so it looks like we need to setup dbt Environments as such:
```dbt itself doesn't offer multiple BigQuery connections per project in the strict sense. However, depending on what you mean by "connection," there are two ways to achieve similar functionality:
Connecting to different datasets within the same BigQuery project: BigQuery projects often contain multiple datasets, similar to databases within a traditional database system. dbt lets you specify the target dataset in your models using the schema and database parameters. This allows you to access and transform data from various datasets within the same project without needing an additional "connection."
Managing environments with different permissions:
You can leverage profiles and environments to configure dbt for different environments (dev, test, prod) with varying permissions and project IDs. Each environment can have its own profile in profiles.yml with separate BigQuery credentials and project IDs. This allows you to run dbt in different contexts but doesn't create multiple simultaneous connections within a single project.
Points to remember:
• While dbt doesn't directly support multiple connections per project, the two methods mentioned above can achieve similar functionality depending on your needs.
• Using multiple datasets within a project is generally simpler and recommended for most use cases.
• Managing environments with different permissions is helpful for isolating data and access across development, testing, and production stages.
• If you have a specific scenario where you need more than one active connection to BigQuery simultaneously, it's important to clearly define your requirements to receive the most relevant solution.```
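To make the first point concrete, here is a hedged example of a dbt model file overriding the target project/dataset via the database and schema config. The project, dataset, and source names below are placeholders, not our actual setup.

```sql
-- Hypothetical dbt model (models/tip/example_view.sql); all names are placeholders.
{{ config(
    database     = 'tort-intake-professionals',   -- BigQuery project
    schema       = 'tip_staging_dashboard',       -- BigQuery dataset
    materialized = 'view'
) }}

select *
from {{ source('lawruler', 'lr_case_types') }}
```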
@Nicholas McFadden, @Brian Hirst and I setup the first Environment and got dev_tip setup with the json keyfile values per the DBT support article.
@Nicholas McFadden, did you update the DBT password? Mind posting here?
Not sure why it wasn't working but I reset it to the above^
@Brian Hirst, update Dev Logins with that new password...
I updated the sheet
@Nicholas McFadden @Brian Hirst @Luke Fontaine, will setup DBT Homework meeting for Friday....
And setup mike.taluc@bytecode.io with all projects access in GCP and send him DBT login name / password....
@Nicholas McFadden, @Brian Hirst had the best idea. Maybe we just get 3 DBT accounts, one per biz unit? :lol:
That was what I was gonna suggest
Seems kinda impractical to try to do it in one
It would be nice but they are different units
Or could be a branch of the main tip anyways
I'm thinking we should do that, this way we do not have tons of complexity.
I'll make analytics@shield-legal.com for Shield, I'll setup ai_@shield-legal.com for a DBT, and setup an analytics@tortintakeprofessionals.com for TIP and get that all ready with @Brian Hirst and @Luke Fontaine
One mistake in the latter config and we could really mess something up
Because anything the other units get from TIP, they want the final version, not the in-process one, so transformations should always be different, or at least downstream
Sucks to not be able to see it all in one view but it's okay
We could have all units in Fivetran but then for final transformation and usage you have to go to the respective dbt and big query projects
We can connect all three DBT's to the same fivetran correct?
@Nicholas McFadden FYI, Fivetran has a cool View by "Database Connection" so it's pretty simple to save the View:
For Friday before I forget @here
FileVine LeadDocket SmartAdvocate Needles Clio Litify Salesforce
@Luke Fontaine, make Partner Firm and Other Partner Firm both be text boxes and put into the Sales Information section.
@Brian Hirst, thank you. From here forward, when we get new ones, add to the SL db table and into Copper. I'll add a New Vertical Field too for if it's a new one.
Whenever Mark notifies the verticals channel I can add into copper
@Brian Hirst @Luke Fontaine, check out Page 18 in the Looker Enterprise dash, I made some color conditional scorecards. Think I'm going to do more of these like this on the heavy financial pages / dashboards.
@here, let's skip today's Admin Meeting, got to get BCL CTM to the new ConnextOne dialer
@Nicholas McFadden, can you get the actual Word Docs and PDFs out of LawRuler for a Lead that is Signed e-Sign from the API?
Perfect, that was a direct request from Cameron on the LawRuler pipelines, wanted to make sure that is in your dev-plans. Maybe confirm if that is true first?
@Luke Fontaine @Brian Hirst, I am done with my dashboards, please get yours done so we can review them (visuals, column formatting, filters, refresh settings, etc.)....these have to be done by Friday EOD. Please give best effort, it's go time to show Cameron and Marc.
*Thread Reply:* I have some more pressing things with Connex and Vici at the moment. Will work on when done with those.
@here Read this page in DBT, it is the KEY considerations as we built our SQL transforms into business facts tables. https://docs.getdbt.com/docs/build/materializations
*Thread Reply:* This one is important as well: how to update key records that already exist versus appending new records to a model-formed table in DBT:
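Roughly, the pattern from that doc looks like the sketch below. This is an assumed example, not one of our actual models: an incremental materialization with a unique_key so rows whose key already exists are updated in place and new keys are appended.

```sql
-- Hedged sketch of a dbt incremental model; source and column names are assumptions.
{{ config(
    materialized = 'incremental',
    unique_key   = 'lr_lead_id'
) }}

select
    lr_lead_id,
    current_status,
    update_date_time
from {{ source('lawruler', 'raw_lr_lead_inbox') }}
{% if is_incremental() %}
  -- on incremental runs, only pick up rows newer than what's already in the table
  where update_date_time > (select max(update_date_time) from {{ this }})
{% endif %}
```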
@here, FYI for SL sales CRM to get their data setup to relate to our Leadspedia and LawRuler data.
@James Scott @Nicholas McFadden, you can see whole meeting here for more details but summary below:
• Ryan Vaspra starts the meeting, mentioning limited time due to his son's illness. He plans to use Firefly to send a summary in Slack. He updates on development administration tasks, addressing a DBT session and sharing URLs regarding BigQuery model object building. Three DBT accounts are set up, awaiting credit card additions. Access was given to Mike Taluc at ByteCode for environment setups. After the DBT session, they will discuss creating views from raw tables, emphasizing the distinction between materialized views and standard views.
• Ryan elaborates on resources: James supports SQL queries, while Nick focuses on pipelines for Shield Legal and Law Ruler. They will leverage each other's skills in software tools like Copper CRM and Conex One. Looker Enterprise will be rolled out incrementally, with Ryan at Shield Legal granted viewer access for dashboards. Folder renaming and gradual replacement with improved versions are planned, aiming for a seamless transition without noticeable content changes for Cameron.
• Progress is discussed on financial dashboards, indicating steady advancement in building them.
@Ryan I would like to get with you on piping this conversions dashboard I am making to Looker. I'll let you know when I'm ready and have it flowing to BQ
@Nicholas McFadden, good morning. How are you progressing on LawRuler Inbox, Case Types, Users, Statuses pipelines? If your DL projects are truly not letting you spend at least 50% of your time on the internal BI development, let me and @Cameron Rentch know. We need to find you some focus time to get us moving to Leadspedia.
I'm also going to setup next week the PostGres Fivetran pipe for TIP AWS database and SL AWS databases into their corresponding BigQuery dbs. I'm doing this to ensure we have our legacy data in place for the prior months and years to transform in DBT to create the historical data. I'll ensure we name the Datasets/Schemas to be clear it's AWS legacy data.
@Cameron Rentch has joined the channel
@Ryan received. Let me take inventory today and get back with you.
@Nicholas McFadden @Luke Fontaine @Brian Hirst, VICTORY!!!
I have Tony, Brett and the TIP team approving our clean up of useless statuses, so our IO tables can define which status and case type is billable, terminal and commissionable. I'm working on it now.
@Quint Underwood, can you please add this status to the Gen Fin Log to include in the Log Table?
'Secondary Interview - Sent', that and the existing 'Secondary Interview - Final' are both billable I found out.
Is this something you can do over weekend @Quint Underwood?
Hey @Ryan I should be able to add that one today. I don't think there will be any manual backfilling we need to do...I think the lambda should pull all data w/ that new status upon next run
@Quint Underwood, exactly. I'll let the TIP Accounting team know they may want to go back and clean up bills.
@Quint Underwood , thank you!
@Luke Fontaine @Quint Underwood, FYI, I had noticed GenFinLog Lambda was running at 7:55 AM but the Status & Call reports were being delivered around 8:01 and 8:11, so I moved them to 7:45 AM so we hopefully catch the 7:55 lambda run.
@here, Nick cannot make today, I'm working on a TV tracking and attribution plan for Cam and Juvenile Hall Abuse, so I will push this to later so we can get all caught up on the latest.
@Nicholas McFadden, did you ever have time to confirm your pipeline for TIP LawRuler update and status? Plus did you confirm if LawRuler API gives us the PDF and DOCS from the casefiles so we can store those in the data warehouse?
@Ryan I have spoken with Cam on prioritization of the tasks. Today I am WFH to focus on code. I will get the status and case type list tables connected. I am working with Fivetran on an issue for conversions/all leads info. I have not been able to get the docs via API. We had other pressing questions for Law Ruler at the time and as we know they are being difficult. I will pose a question to them and let you know if it is something they can quickly help out with or if it is something they have to add to the scope
*Thread Reply:* @Nicholas McFadden, let me know the latest on pipelines. This week, @Luke Fontaine and I are starting to create the application tables and seed data, plus setting up Fivetran pipes for AWS TIP and SL.
*Thread Reply:* @Nicholas McFadden, can you give us a full status on the LawRuler pipelines below:
@Nicholas McFadden, thank you. Yeah, I had a great conversation with Cam yesterday and just want to help you juggle things so we keep making some progress. Can you add all your projects to the new Tech Board too?
Let me finalize the PMO process I am building and then my projects can get added to your boards from my system
Copper roll out steps:
1. Finish upload SQL
2. Confirm with Andy
3. Map out new process and set up training
4. Train people
5. Build new SQL for old database and Dashboard
6. Update LP and LDP table with Copper names
Jobs:
Accounting: must add opportunity/task payment for all payments as they come in
Luke and Brian: until further notice, post in verticals
Mark Maniora: must add Leadspedia contract ID and advertiser ID back into Copper after the contract has been built.
@Nicholas McFadden @James Scott @Brian Hirst @Luke Fontaine @Mike
From discussions Nick and I have had, we have set up a group development meeting on Mondays. I'll leave that in place and cancel today's short one, but this is one that I need everyone to show up to consistently. We need to discuss what works better for the group; we can do that on Monday.
Looker Dimension to Measure Example in Looker .view files to create aggregation
```dimension: c_cash_flow {
  type: number
  sql: ${TABLE}."c_cash_flow" ;;
}

measure: c_cash_flow_sum {
  type: sum
  sql: ${c_cash_flow} ;;
}```
Looker User Row Level Permissions > User Attributes
```include: "/derived_tables/customers.view" include: "/views/users.view"
explore: customers { #requiredaccessgrants: [sensitivedata] accessfilter: { field: users.state userattribute: allowedstate } join: users { type: leftouter sqlon: ${customers.userid} = ${users.id} ;; relationship: onetoone } sqlalwayswhere: ${totallifetimeorders} > 1 ;; persistfor: "1 hour" }```
Looker Drilldown in View files
dimension: age_tier {
description: "Tiered user ages [0, 15, 26, 36, 51, 66]"
type: tier
tiers: [0,15,26,36,51,66]
sql: ${age};;
style: integer
drill_fields: [gender]
}
Object Linking from Source Dashboard to Destination Dashboard KEY FOR ORDERS TO CASEFILES
dimension: id {
description: "Unique identifier of users"
primary_key: yes
type: number
sql: ${TABLE}.id ;;
link: {
label: "User Detail"
url: "<https://looker.bytecode.io/dashboards/v6DaTZoCWvBCIT8YUAb91K?User+ID={{> value }}"
}
}
@here "lrcasetypes" dataset/table connected and initial synch successful
@Nicholas McFadden, nice, did the full "io" columns make it, like below?
//LawRuler Case Types List from API
//CaseType Naming convention [case_type_vertical] - [case_use] - [litigating firm] - [marketing firm] - [lead source]
//AltID Naming Convention [marketing Firm] - [litigating Firm] - [advertiser id] - [contract id]
//The m_ (modified) fields are created from parsing the CaseType Naming Convention and looking up AltID
//The io_ fields are input and output from an application for TIP management to update for pricing models
Table raw_io_lr_case_types [headercolor: #800080] {
case_type_id integer pk //API field is ID
version_ref_pst date
version_ref date
case_type_name varchar //original field is called Name in API and parsed below in the m_ fields
m_leadspedia_advertiser_id integer [note: 'This is a manual field if not found in intake template field question for "AltID"/custom10043']
m_leadspedia_contract_id integer [note: 'This is a manual field if not found in intake template field question for "AltID"/custom10043']
m_case_type_vertical varchar [note: 'Parsed from Name field from LR']
m_case_use varchar [note: 'Parsed from Name field from LR']
m_marketing_firm varchar [note: 'Parsed from Name field from LR']
m_litigating_firm varchar [note: 'Parsed from Name field from LR']
m_lead_source varchar [note: 'Parsed from Name field from LR']
io_casetype_billable boolean [note: 'Updated via Application, default "True"']
io_casetype_billable_rate float [note: 'Updated via Application, default "122.00"']
io_agent_commission_rate float [note: 'Updated via Application, default "60.00"']
io_call_time_rate float [note: 'Updated via Application, default "2.00"']
io_call_time_flat_fee float [note: 'Updated via Application, default "2.00"']
}
*Thread Reply:* @Nicholas McFadden,
I see the pipeline for lrcasetypes but we are missing the application (input/output) fields below in your final data schema on the table.
As a solution, we would love to discuss the best way to do that...thinking we create an x-ref table that can join to it, and then be the main table that "TIP Users" use in a UI to update Case Type billable rates for conversions and talk time revenue.
(lr_case_types_id,
io_casetype_billable,
io_casetype_billable_rate,
io_agent_commission_rate,
io_call_time_minute_rate,
io_call_time_flat_fee)
@Nicholas McFadden, if the "m" fields did not make it, not a big deal, we can do the parse in SQL on the CaseTypeName, but we will definitely need the io fields for our financial aggregations.
@Luke Fontaine, can you create a Google Sheet with these columns, and I'll send you February and March 2024 PnLs where you can add the values from those sheets so we have 2 months of seed data to play with in DBT.
• TOTAL SALARIES +
• TOTAL OTHER PERSONNEL EXPENSES +
• TOTAL OCCUPANCY EXPENSES +
• TOTAL TRAVEL EXPENSES +
• TOTAL TECHNOLOGY EXPENSES +
• TOTAL OFFICE EXPENSE +
• TOTAL PROFESSIONAL SERVICES
= TOTAL OPERATING EXPENSE
@Brian Hirst @Luke Fontaine @Nicholas McFadden @James Scott, I'm on the call now. Been working with Cameron on BCL auto accident stuff and ran a bit over.
No updates, we can rehash at Wed meeting and I can deliver any necessary updates to Cam at my 1-1 later on Wed
@Luke Fontaine, here is the new report.
@Luke Fontaine @Brian Hirst, I put @Nicholas McFadden Connex One Data Dictionary and Data Schema on the Shared Drive for Analytics team: https://drive.google.com/drive/folders/1hTakZvfVrliHIorCv9qpAjbjIvCYjKnj?usp=drive_link
Bookmark this, we'll need it!
@Brian Hirst @Luke Fontaine, important insight from Nick. Note this as well, we'll mess with this in our meeting today.
@Nicholas McFadden, sent this invite for tomorrow, need you at this meeting. It has a single purpose: figuring out how to create application tables if the pipelined raw data does not have the fields created from your Cloud Functions.
TIP BI Pipeline - LR Case Types for Billables solution
SELECT
pipe.name AS pipeline_name,
stage.name AS pipeline_stage_name,
opp.id AS opportunity_id,
opp.name AS opportunity_name,
opp.company_name,
users.name AS assignee_name,
opp.converted_value AS target_order_value,
MAX(CASE WHEN field_definition.name = 'AWS Order ID' THEN CAST(custom_field.value AS FLOAT64) END) AS AWS_Order_ID,
MAX(CASE WHEN field_definition.name = 'Amount Paid To Date' THEN custom_field.value END) AS amt_paid_to_date,
MAX(CASE WHEN field_definition.name = 'Campaign Start Date' THEN TIMESTAMP_SECONDS(CAST(custom_field.value AS INT64)) END) AS campaign_start_date,
MAX(CASE WHEN field_definition.name = 'Campaign End Date' THEN TIMESTAMP_SECONDS(CAST(custom_field.value AS INT64)) END) AS campaign_end_date,
MAX(CASE WHEN field_definition.name = 'Approx. Contracts / Leads' THEN custom_field.value END) AS approx_units_sold,
MAX(CASE WHEN field_definition.name = 'Leadspedia Contract IDs (e.g. 1201,1203,1203)' THEN custom_field.value END) AS lp_contract_ids,
MAX(CASE WHEN field_definition.name = 'Case Type / Campaign' THEN SPLIT(field_option.name, ' - ')[OFFSET(0)] END) AS cpr_case_type,
REGEXP_REPLACE(MAX(CASE WHEN field_definition.name = 'Partner Firm' THEN custom_field.value END), '"', '') AS cpr_partner_firm,
REGEXP_REPLACE(MAX(CASE WHEN field_definition.name = 'Other Partner Firms' THEN custom_field.value END), '"', '') AS cp_other_partner_firms,
REGEXP_REPLACE(MAX(CASE WHEN field_definition.name = 'Campaign Code' THEN custom_field.value END), '"', '') AS cpr_case_use_code,
MAX(CASE WHEN field_definition.name = 'Call Center (Contract sales only)' THEN field_option.name END) AS cpr_call_center,
MAX(CASE WHEN field_definition.name = 'Sale Type' THEN field_option.name END) AS cpr_sale_type,
CONCAT(
MAX(CASE WHEN field_definition.name = 'Case Type / Campaign' THEN SPLIT(field_option.name, ' - ')[OFFSET(0)] END),
' - ',
REGEXP_REPLACE(MAX(CASE WHEN field_definition.name = 'Partner Firm' THEN custom_field.value END), '"', ''),
' / ',
REGEXP_REPLACE(MAX(CASE WHEN field_definition.name = 'Other Partner Firms' THEN custom_field.value END), '"', ''),
' ',
REGEXP_REPLACE(MAX(CASE WHEN field_definition.name = 'Campaign Code' THEN custom_field.value END), '"', ''),
' ',
MAX(CASE WHEN field_definition.name = 'Call Center (Contract sales only)' THEN field_option.name END),
' - ',
MAX(CASE WHEN field_definition.name = 'Sale Type' THEN field_option.name END)
) AS cpr_lp_contract_name
FROM shield-legal-bi.copper.opportunity AS opp
JOIN shield-legal-bi.copper.pipeline AS pipe
ON opp.pipeline_id = pipe.id
JOIN shield-legal-bi.copper.pipeline_stage AS stage
ON opp.pipeline_stage_id = stage.id
JOIN shield-legal-bi.copper.users AS users
ON opp.assignee_id = users.id
JOIN shield-legal-bi.copper.opportunity_custom_field AS custom_field
ON opp.id = custom_field.opportunity_id
JOIN shield-legal-bi.copper.custom_field_definition AS field_definition
ON custom_field.custom_field_definition_id = field_definition.id
LEFT JOIN shield-legal-bi.copper.custom_field_definition_option AS field_option
ON CAST(custom_field.value AS STRING) = CAST(field_option.id AS STRING)
WHERE opp.pipeline_id = 1043396 --Orders Pipeline ID value
AND opp.id = 31719647 -- Test Case Opportunity: Illinois Juvenile Hall Abuse - BG/Walsh (TIP) - Contract
GROUP BY
pipe.name,
stage.name,
opp.id,
opp.name,
opp.company_name,
users.name,
opp.converted_value
ORDER BY opportunity_id DESC
@here
After much work, I used our Fivetran pipe for our AWS Legacy database, then learned Fivetran does not pipe views, materialized views or tables without primary keys, so I fixed that in PostGres, then SUCCESSFULLY created all the views we need to generate a BigQuery stored View for matviewfinancialdetails.
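For anyone repeating this later, the Postgres-side fix was along these lines. A sketch only, with hypothetical object and key names, since Fivetran only syncs plain tables that have primary keys:

```sql
-- Postgres sketch, hypothetical names: materialize the view into a table Fivetran can sync.
CREATE TABLE IF NOT EXISTS public.matview_financial_details_sync AS
SELECT * FROM public.matview_financial_details;

-- Give Fivetran a primary key to sync on (assumed unique key column).
ALTER TABLE public.matview_financial_details_sync
  ADD PRIMARY KEY (fin_log_id);
```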
Time to celebrate!
@Luke Fontaine @Brian Hirst, FYI, I rescheduled the TIP DBT Config & Validate meeting to tomorrow afternoon.
@Brian Hirst, task is to complete another DBT Intermediate SQL file (e.g. table) named lr_case_types_xref.sql that is a JOIN statement like the one below (you must update the project, source, and table names in the DBT schema.yml files). It creates the "person's view" into which ones need to be "configured" in the future app, whether or not this SQL table becomes a dashboard and/or app interface.
```--BIGQUERY SQL, must reference your DBT schema.yml sources to work in DBT.
SELECT
  lct.id,
  lct.name,
  rates._fivetran_batch,
  rates._fivetran_index,
  rates._fivetran_synced,
  rates.id,
  rates.io_casetype_billable,
  rates.io_casetype_billable_rate,
  rates.io_agent_commission_rate,
  rates.io_call_time_minute_rate,
  rates.io_call_time_flat_fee
FROM lr_case_types.lr_case_types AS lct -- Alias for lr_case_types table
LEFT JOIN tip_prod_application.io_lr_case_types_revenue_rates AS rates
  ON lct.id = rates.lr_case_types_id -- Using aliases for table and column clarity
WHERE lct._fivetran_synced = (SELECT MAX(_fivetran_synced) FROM lr_case_types.lr_case_types)
ORDER BY lct.id DESC;```
JOIN - show me rows from both tables only where a row exists in both.
LEFT JOIN - show me all rows from the left table (lr_case_types) and the matching rows (even if blank) from the right table (io_lr_case_types_revenue_rates).
RIGHT JOIN - reverse of LEFT JOIN.
** Always merge from ryan_dev to staging in GitHub and run the DBT Job.
Updated code:
```graph TB
classDef waiting fill:#dddddd,stroke:#aaaaaa
classDef review fill:#lavender,stroke:#9370DB
classDef final fill:#lightgreen,stroke:#19A000
classDef declined fill:#pink,stroke:#C02942
A["New SIL/LSA/Organic Lead "] --> |Agent moves if lead disqualifies| B("Case Review - Agent Disqualified")
A["New SIL/LSA/Organic Lead "] --> |Agent moves if qualifies| C("Case Review - In Process")
C["Case Review - In Process "] --> |Agent phone transfers to firm| D("Case Review - Phone Transfer </br>(Bill/Commision)")
C["Case Review - In Process "] --> |Agent emails to firm| E("Case Review - Email Transfer </br> (Bill)")
D("Case Review - Phone Transfer </br> (Bill/Commision)") --> |Analyst moves| F["Case Review - Phone Transfer RE-TRIGGERED"]
E("Case Review - Email Transfer </br> (Bill)") --> |Analyst moves| G["Case Review - Email Transfer RE-TRIGGERED"]
F["Case Review - Phone Transfer RE-TRIGGERED"] -->|Agent only moves if notified| J["Case Review - Firm Signed"]
G["Case Review - Email Transfer RE-TRIGGERED"] --> |Agent only moves if notified| L["Case Review - Firm Rejected"]
F["Case Review - Phone Transfer RE-TRIGGERED"] -->|Agent only moves if notified| M["Case Review - Firm Signed"]
G["Case Review - Email Transfer RE-TRIGGERED"] --> |Agent only moves if notified| N["Case Review - Firm Rejected"]```
```graph TB
classDef waiting fill:#dddddd,stroke:#aaaaaa
classDef review fill:#lavender,stroke:#9370DB
classDef final fill:#lightgreen,stroke:#19A000
classDef declined fill:#pink,stroke:#C02942
A["Sent e-Sign "] --> |****Automation Must Move To****| B("Signed e-Sign")
B["Signed e-Sign"] --> |Moves to Processing| C("Signed e-Sign IN PROCESS")
C("Signed e-Sign IN PROCESS") --> |Moves to Vertification| D("Signed e-Signed VERIFY")
D("Signed e-Signed VERIFY") -->|Moves to Quality Assurance| E("Signed e-Sign QA WIP")
E("Signed e-Sign QA WIP") -->|Moves to Flagged| G("Signed e-Sign FLAGGED")
G("Signed e-Sign FLAGGED") -->|Moves If Necessary| H("Signed e-Sign House Flag")
H("Signed e-Sign House Flag") -->|Moves to Approved| I("Signed e-Sign FINAL")
I("Signed e-Sign FINAL") -->|Moves to Retrigger if integration resent| F("Signed e-Sign QA (commission)")
I("Signed e-Sign FINAL") -->|Moves to Retrigger if integration resent| J("Signed e-Sign RE-TRIGGERED (commission)")
I("Signed e-Sign FINAL") -->|Declined by Firm| K("Signed e-Sign - Billable Declined by Firm (commission)")
I("Signed e-Sign FINAL") -->|Declined by Claimaint| L("Signed e-Sign Client Opt Out (commission)")
I("Signed e-Sign FINAL (comission)") -->|Declined| M("Signed & DECLINED (no bill/no commission)")```
@Luke Fontaine loving the drug dashboard!! For the summary, I think two small tables we can add that could add a lot of value are a top 1-5 drug ranking and top grossing (which is the percent change column we created). Love the maps!
*Thread Reply:* Thanks @James Scott! yeah absolutely, I can make those adjustments on the tables
@Brian Hirst @Luke Fontaine, just a friendly reminder to always develop in the DBT ryan_dev branch for TIP. I just merged your latest SQL in the IDE from Staging to ryan_dev so they match.
Will keep in mind, thank you for the reminder
LawRuler SOW Linear Deliverables...aiming to have $4K to $18K left for other features of less priority than these:
@Quint Underwood, got a unique request. Can you send me a code block of Python that is tested and validated to return the dataset from the Leadspedia API contracts endpoint? It's urgent, but it will let me set up a GCP BigQuery Python notebook that I can schedule as a "band aid" to update matviewfinancialdetails to contain a new column for the Contract IDs, which I can then join to our sales CRM table I created for Shield Orders.
for example, the export from Leadspedia Contracts
*Thread Reply:* FYI @Nicholas McFadden @Luke Fontaine @Brian Hirst, for the Copper + Shield AWS Legacy View, MatView_FinDetails JOIN for the Shield Orders in the existing dashboards.
*Thread Reply:* @Ryan I'll try to get something working over lunch for you
*Thread Reply:* @Quint Underwood, perfect, THANK you.
*Thread Reply:* @Quint Underwood , any update on this code
*Thread Reply:* Hey @Ryan still working on this, will try to have it to you by Monday morning
*Thread Reply:* Also FYI @Ryan I'll be away next couple days, should be back Sunday afternoon in case anything comes up
@Luke Fontaine @Brian Hirst, I invited your sl emails to be Developers on DBT, you'll get to setup your own User and then we can pull down Staging branch to your own "BrianDev" and "LukeDev" branches that you can use to learn. Please confirm receipt and that you setup your users. Ty.
I'm in DBT SL and DBT TIP I'll figure out the rest
@Brian Hirst, how you coming on the BigQuery seed data for the iorevrates table? Can you have that done by EOW?
*Thread Reply:* I believe so
@Luke Fontaine, so there's no more messing with Looker Studio since it seems to have issues this AM, here is TIP e-Signs Pivot Yesterday in Looker. Took about 20 minutes to create and set up: https://shieldlegal.cloud.looker.com/dashboards/43
@Ryan Yeah I just downloaded the SQL from today leads and have been working off of that. Can you give me access permissions
*Thread Reply:* Thank you for sharing this, I want to take a deeper dive into it, I think it's pulling unique counts of S1's but not the frequency
*Thread Reply:* Ah, that's why it is off...yeah, we'll need to maybe SUM instead of count unique.
*Thread Reply:* @Luke Fontaine, fixed that, had to SUM the count, not Unique Count
@Ryan yes it's working now, looks like I just needed permissions
@Quint Underwood, sorry for the ever changing changes to TIPFinLogPY311 but please add these statuses to be processed and please backfill from Jan 1, 2024 for these.
FYI @Nicholas McFadden @deactivateduser @Luke Fontaine, once backfilled, we'll get Brian, Alan and team to reconcile in LR and get billing out for these unbilled items.
β’ Secondary Interview - ASA
β’ HIPPA Signed
@deactivateduser has joined the channel
@Luke Fontaine, please post those 2 new columns we need for our iolrcasetypesrev_rates table. I will rebuild the table and seed data.
*Thread Reply:* revenue_rates_agent_commission_override and category_point_denomination with a default value of 10
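Noting for the rebuild, a sketch of the DDL with those two columns added. Column and table names are reconstructed from the chat, so treat every name and type here as an assumption to verify against the live table.

```sql
-- BigQuery DDL sketch; all names/types are assumptions, not the confirmed schema.
CREATE OR REPLACE TABLE `tort-intake-professionals.tip_prod_application.io_lr_case_types_revenue_rates` (
  lr_case_types_id                         INT64,
  io_casetype_billable                     BOOL,
  io_casetype_billable_rate                FLOAT64,
  io_agent_commission_rate                 FLOAT64,
  io_call_time_minute_rate                 FLOAT64,
  io_call_time_flat_fee                    FLOAT64,
  revenue_rates_agent_commission_override  FLOAT64,
  category_point_denomination              INT64 DEFAULT 10   -- default of 10 per the note above
);
```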
@deactivateduser, and FYI @Nicholas McFadden @Luke Fontaine
I am working on giving you access to AWS, GitHub, Looker Enterprise and the main dashboards we built on Looker DataStudio Google Workspace freemium. And the TIP Google Cloud project / bigquery.
Essentially, here is the 20,000 foot view of our systems: the legacy system is AWS Lightsail Postgres with Python Lambdas that are run each morning manually by me, Luke and Cameron. We have three stages, listed below by their function, then their names.
Shield Legal: Morning Financials App: https://us-east-1.console.aws.amazon.com/lambda/home?region=us-east-1#/applications/dailyfinancials-rebuild-py311
• PLEASE DO NOT RUN THESE AS THEY ARE NOT STATE INTELLIGENT AND WILL SCREW UP PRODUCTION DATA; CODE is in GitHub
• Calculate Performance runs first
• Apply Adjustments runs second (it reduces Leadspedia order budgets without regard for duplication)
• Master Append models the financial table and appends rows
We'll plan a session for you to watch Luke and me do the manual routine for TIP, which is the pseudocode workflow for the incomplete Calc_Perf TIP version.
@deactivateduser @Nicholas McFadden @Luke Fontaine
SL Morning Financials Routines / Functions App
a. https://us-east-1.console.aws.amazon.com/lambda/home?region=us-east-1#/applications/dailyfinancials-rebuild-py311
Calculate Performance - Production Lambda:
• https://us-east-1.console.aws.amazon.com/lambda/home?region=us-east-1#/functions/dailyfinancials-rebuild-p-CalculatePerformancePy31-HMx37JchlIZ8?tab=code
    ◦ Get LP Sold Leads from Yesterday at 3:00 AM EST through Today at 2:59:59 AM EST to reflect one 24-hour period of Sold Leads
    ◦ GET TypeForm via Zapier to pull in Media Spend by day, by campaign / vertical / case type, and allocate media spend to sold leads to create the DONOTTOUCH staging table
    ◦ Humans get TikTok spend, clone rows, pull and pivot LP sold leads by TT leads, manually add rows to DONOTTOUCH with SL revenue, allocated spend against leads, and signed contracts
@Quint Underwood, meet @deactivateduser, our new AWS Python Data Engineer and he needs 30 minutes with you, me and @Nicholas McFadden to show him a few things:
*Thread Reply:* @deactivateduser, @Quint Underwood confirmed he'll be on the call tomorrow so he can get you into the code.
@here, with Mike slammed on FB Trusted Forms, 2 key integrations, Looker SL dash, AI dataset finalization and TIP dialer time back today. We'll meet next Monday and get us all back into sync.
@deactivateduser @Nicholas McFadden @Luke Fontaine
Firefli.ai app did not send me my meeting notes but here is updated Workflow Diagram for SL (1st image) and TIP Workflow and Data flow diagram (2nd image)
@deactivateduser @Nicholas McFadden and here is the systems architecture diagram. Red boxes represent the AWS PostGres system, the others are for the new system for both TIP and Shield Legal:
Thanks, those should be very helpful! I'll be sure to record my next meeting with you guys because that will probably be a useful reference that'll avoid me having to reach out over remembering minor steps in the processes.
Yeah, I will work closely on a code review of both existing CalcPerf PROD and his code for CalcPerf_TIP version so you will have my total support during this...
Awesome, thanks Ryan!
@Brian Hirst @Luke Fontaine FYI @Nicholas McFadden
DBT for TIP Analytics build has been updated with the new /Intermediate/lr_case_types_xref.sql model, including all new columns we added for TIP Payroll at the case type level. The Staging Environment has been merged, and for safekeeping the Main branch has been updated but not BUILT to production yet. (Link goes to BigQuery > Views: https://console.cloud.google.com/bigquery?orgonly=true&project=tort-intake-professionals&supportedpurview=project&ws=!1m5!1m4!4m3!1stort-intake-professionals!2stipstagingdashboard!3slrcasetypesxref)
Logging out of that account in DBT and moving on to SL DBT build debugging. @Nicholas McFadden, I'll create these app tables too since we diagrammed it and it goes with your idea to parallel develop:
Table io_ad_platforms {
platform_id integer [primary key]
platform_name varchar [note: 'Facebook,TikTok,FacebookIFLA,SnapChat,Google,Parler,Outbrain,Bing,Reddit,Yahoo,Taboola,LinkedIn,DV360,TV']
created_date timestamp
modified_date timestamp
}
Table io_media_buyers {
media_buyer_id integer [primary key]
full_name varchar
mb_initials varchar
base_compensation float
overhead float [Note: '$10,000']
commission_percent float [Note: '5%']
revenue_commission_amt float [Note: '5000 at 1M, 10000 at 2M, 15000 at 3M']
created_on date
}
//Input form by Media Buyers spend, platforms and verticals
Table io_media_spend_form {
media_spend_log_id integer [primary key]
media_buyer_id integer //FK
spend_date date
platform_id integer // FK
ldp_vertical_id integer //FK
spend_amount float
create_date timestamp
modified_date timestamp
}
@Ryan finished the April 2024 commission adjustment in BigQuery
@deactivateduser @Nicholas McFadden, yesterdays SL Calculate Performance AI Notes: https://app.fireflies.ai/view/SL-Calculate-Perf::XEl76gq7P4Z17jf2
FYI @Ryan and @Nicholas McFadden, moving here for pipeline request follow up. Thanks Nick!
FYI @Luke Fontaine, I'll explain but keep working on the CONCAT fin details match to copperxref and copperlist in BQ.
@deactivateduser, meeting notes for SL Lambdas: https://app.fireflies.ai/view/SL-Lambdas::8AD1c43DZNaMnJMJ
Hey @deactivateduser & @Ryan, how's the onboarding coming thus far? Lmk if there are any specific pain points or questions. Hopefully my availability will be a little better in the next 1-2 months and I can help get services transitioned over to Shane
Thanks Quint! Things are going well so far but I'll be sure to reach out if I'm hitting any roadblocks.
Working Looker Table Calc @Luke Fontaine
case(when(${Financial_Details.rev_month}=date(2024,01,01),16.28), 0)
@Luke Fontaine
e_daily_spend + e_agency_fee + e_cc_cost + e_sales_commission + e_mb_commission + flatirons_daily_overhead = TOTAL EXPENSE
Net Profit:
e_daily_revenue - e_daily_spend - e_agency_fee - e_cc_cost - e_sales_commission - e_mb_commission - flatirons_daily_overhead
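Same two formulas written as SQL, if it's easier to drop into a scratch query. The source view name is a placeholder and the e_ columns are assumed to exist as named above.

```sql
-- Sketch only; the source view name below is a placeholder, not the real object.
SELECT
  e_daily_spend + e_agency_fee + e_cc_cost
    + e_sales_commission + e_mb_commission + flatirons_daily_overhead   AS total_expense,
  e_daily_revenue - e_daily_spend - e_agency_fee - e_cc_cost
    - e_sales_commission - e_mb_commission - flatirons_daily_overhead   AS net_profit
FROM `shield-legal-bi.aws_legacy_views.view_fin_details`
```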
@here, pretty cool site if you have never used it before:
@Nicholas McFadden @Brian Hirst @Luke Fontaine
I am setting up a meeting with Cameron and us, he wants to have new client dashboards built but of course the data he wants is not in a data warehouse. So tomorrow, we'll let him tell us what he wants but in our SMS together, read below for details:
*Thread Reply:* @Nicholas McFadden, he does not understand that his desire to have their own ability to pull lead reports, criteria Q&A, and retainer documents from this client dashboard is totally dependent upon the LR Advanced API, so I continued the conversation pushing back on him, and that's the reason I sent the email to Branden Brooks. We need to stand unified tomorrow; building email ingestion of these files into GCS hopefully isn't a "temporary solution" that wastes our time, when building the data pipes for Inbox, Intake, Documents will empower SO much more development.
*Thread Reply:* If we have trouble reconciling and sending them ourselves, why does he think giving them access to pull will change anything? They will have zero context for all the craziness in LR or what it has taken for you to establish the naming conventions, etc., that we still have trouble enforcing. Them seeing raw data is going to cause a million fire drills and we will get nowhere
*Thread Reply:* So I agree with you
@Luke Fontaine, just with the projects stacking up, here is priority of the things you are working on....
@Luke Fontaine, FYI., hold off on the Flatirons by month, day and year dashboards in Looker Enterprise, I found a few oversights in Shane K.'s updated LookML he is fixing now. When he is done we'll have all internal, external and combined measures for the new viewfindetails that will have Flatirons Overhead and Flatirons Net Profit (for internal / external / combined).
When he is done, I will create Page 41 to completion. You can get Flatirons done using the external metrics.
*Thread Reply:* Ok sounds good, lmk when ready and I'll resume finishing the dashboards
@James Scott I canceled our AI meeting tomorrow, there are a couple of items I'm going to do more exploratory analysis on in the data set, would you be available Monday after tech sprint for Q&A? fyi @Ryan
@Nicholas McFadden, the data looks great in SL GCP BigQuery for Leadspedia All Contracts. Thank you!
@Luke Fontaine, per our Architecture Diagram, I updated DBT for leadspedia model and schema.yml files. I put in /Source the View model for rawldpcontract that you will use as we architected.
The notes below are not guaranteed the exact names, just use as pseudocode.
Columns & Tables to Join
• bicopperorderslist (all fields except Leadspedia Contract IDs, and give Fivetran columns a table-specific alias; we'll want to see latest synced in copper.opportunity and leadspediaallcontracts)
• bicopperleadspediaxref (only Contract ID in SELECT)
• rawldpcontracts (all fields; make overlapping ones (e.g. fivetran) table-specific aliases)
• shield-legal-bi.awslegacyviews.matviewfinancialdetails (all columns)
Your JOINS (and these should be validated; see the sketch below):
• bicopperorderslist.oppid (one to one) to bicopperleadspediaxref.oppid
• bicopperleadspediaxref.contractID (one to many) to rawldpcontracts.contractID
• rawldpcontracts.contractNAME (one to one) on matviewfinancialdetails CONCAT of campaign " - " customer " - " saletype
And we should see ALL rows in matviewfinancialdetails, no matter if rows are missing in bicopperorderslist.
Do all of this in DBT even though you can develop in BigQuery. Be sure to use the TeamDev branch or create your own and pull from TeamDev, it has latest working Staging DBT Build. Do not push to staging.
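Here is a rough BigQuery sketch of those joins to start from. Every dataset, table, and column name below is my best guess at the real ones, so validate each against the schema.yml sources before turning it into the dbt model.

```sql
-- Hedged sketch: names are assumptions; LEFT JOINs keep every mat_view_financial_details row.
SELECT
  fin.*,                                    -- all financial detail columns
  ol.opp_id            AS ol_opp_id,        -- Copper orders list (assumed columns)
  ol.opportunity_name  AS ol_opportunity_name,
  xref.contract_id     AS xref_contract_id, -- Copper <-> Leadspedia x-ref
  c.contract_id        AS ldp_contract_id,  -- Leadspedia contracts
  c.contract_name      AS ldp_contract_name
FROM `shield-legal-bi.aws_legacy_views.mat_view_financial_details` AS fin
LEFT JOIN `shield-legal-bi.leadspedia.raw_ldp_contracts`           AS c
  ON c.contract_name = CONCAT(fin.campaign, ' - ', fin.customer, ' - ', fin.sale_type)
LEFT JOIN `shield-legal-bi.copper.bi_copper_leadspedia_xref`       AS xref
  ON xref.contract_id = c.contract_id
LEFT JOIN `shield-legal-bi.copper.bi_copper_orders_list`           AS ol
  ON ol.opp_id = xref.opp_id
```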
@Luke Fontaine we need this done by end of day Wednesday in BigQuery. We can move to DBT together once we know the query to build this bundled View (to say the least) works. I put this into Monday for you. The table name should be like bi_shield_order_financials
FYI @Cameron Rentch, this is THE step to create the "Paid Not Spend" / Shield Orders Fulfillment dashboards.
*Thread Reply:* @Luke Fontaine FYI, from BigQuery for super reference:
@Nicholas McFadden , what is your status or plan on these LawRuler Advanced API pipelines? These are increasing in urgency and priority. Please let us know:
lrleadsinbox lrleadsintakes lrleadsquestions (not sure if that belongs to lrleadsintakes)
@Luke Fontaine, how are you coming on the DBT Build of the shieldordersfinancials model?
*Thread Reply:* @Ryan I'm still running into the roadblock of the build in dbt, not sure why, but it's having difficulties with certain columns. I've tried a few different tactics, but they haven't been successful yet. Looking into alternative options to get it up and running again
*Thread Reply:* Ok, keep urgency on this. FYI @James Scott, this DBT build for Shield financials, Shield client dashboards and now a new custom set of "Sales Rep Pages" our investor wants are @Luke Fontaine's top priority. He's stuck on DBT SQL columnar references, so let's give him time and I'll move the AI dashboard to tomorrow for progress review.
@James Scott, how are you coming on the PDF Extraction effort after yesterday's convo?
@Ryan great news, fixed the error, the model has been built successfully and is appearing in BigQuery
Can you give me an example Copper Opportunity ID with two LP contracts (e.g. TikTok and non-TikTok) that you used or are using for the financial validation on an order? Especially one with a Payment from Brian Thiessen in Copper. Send me that information, I'll revalidate your validation, and then I can have the new Paid Not Spend dashboard @Luke Fontaine
*Thread Reply:* yeah no problem, I'll get that over shortly
*Thread Reply:* @Ryan just validated 1455, and 1456 looks correct
*Thread Reply:* What is the Copper Opportunity ID for those 2 contracts in LP? And can you get a Looker Enterprise page ready with the data in BigQuery in Looker? Remember, you'll have to "Add View From Table" in Looker logged in as your Admin Dev user first before it will appear in the Explore to create the Dashboard @Luke Fontaine
*Thread Reply:* Opportunity ID:31450160 Got it, on it
*Thread Reply:* @Luke Fontaine, got the above, thank you. When can you have the dashboard page ready?
*Thread Reply:* @Luke Fontaine, good morning. Great work, almost finished. See below for what I need after you get done with daily financials:
... If you do it in Looker, you can create a way for an order to be expanded / drilled into for daily financial details, so that might be best.
... Either method, be sure we "sum" on the dailyfinancials so that our "Revenue Spent" and "Cost Spent" sum up according to the days listed in revdate.
... You can show MIN (revdate) and MAX (revdate) in this new SQL to ensure we can have the "Media Start Date" and "Media Last Date" like in Paid Vs Spent.
... If your Looker Enterprise Dashboard works with one row per Copper Order but can drill-down into daily financials, that would be AMAZING. But a dashboard with just "One Row Per Copper Order" and the measures SUMMED and would be ACCEPTABLE.
SELECT *
FROM `shield-legal-bi.sl_staging_dashboard.bi_shield_order_financials`
WHERE ol_opp_id = 31450160
ORDER BY rev_date DESC;
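And for the "one row per Copper order" version, something like this. Same caveat: the measure column names are assumptions, so pick the real revenue/spend fields from the model.

```sql
-- Sketch: roll the daily rows up to one row per order with summed measures and date bounds.
SELECT
  ol_opp_id,
  MIN(rev_date)      AS media_start_date,
  MAX(rev_date)      AS media_last_date,
  SUM(daily_revenue) AS revenue_spent,   -- assumed column name
  SUM(daily_spend)   AS cost_spent       -- assumed column name
FROM `shield-legal-bi.sl_staging_dashboard.bi_shield_order_financials`
WHERE ol_opp_id = 31450160
GROUP BY ol_opp_id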
*Thread Reply:* @Luke Fontaine, the columns I was referring to...
@Luke Fontaine, let us know when you have time to discuss your client dashboarding projects for Shield, these are URGENT as @Nicholas McFadden is working on the secondary data from LawRuler and Brian is working on the Shield Portal. We need this client dashboard draft done in Looker Enterprise by Thursday.
@Ryan the client campaign Looker Studio dashboards are the ones that were included in my contract, all others were done as a courtesy to you.
@Luke Fontaine, in your Shield Copper Orders SQL which field in your DBT SQL model is calculating the critically important field we called "Paid Revenue Left To Spend" which is the Total Amount Paid To Date minus the Shield Revenue from matviewfindetails?
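For clarity, the calculation being asked about is essentially this. The column and table names are hypothetical; the real ones live in Luke's dbt model.

```sql
-- Sketch of the definition only; names are assumptions.
SELECT
  ol_opp_id,
  MAX(amt_paid_to_date) - SUM(shield_revenue) AS paid_revenue_left_to_spend
FROM `shield-legal-bi.sl_staging_dashboard.bi_shield_order_financials`
GROUP BY ol_opp_id
```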
@Nicholas McFadden @Ahsan
TIP Database Lambda for creating the public.financial_log table run at 4AM / 8AM / 12PM / 3PM PT:
TIP-Py311-GenerateFinancialLogPy311-9YN38SeLubUn
Lambda Update Requirements:
• Today, PROD code does this:
    ◦ Processes all call and statusreport data into the finlog table, where LRLeadID MUST be unique in the table
    ◦ Looks up the Leadspedia LeadIDs from the Case Description field (if blank, it will not find the lead in LP and will leave S1 blank); if matching in Leadspedia, it pulls the Contract ID and the Contract Name for that LeadID
    ◦ Processes status based on these filters:
        ▪️ Signed e-Sign%
        ▪️ Case Review%
        ▪️ Secondary Interview - FINAL - hardcoded
        ▪️ Secondary Interview - Sent - hardcoded
• Update code to process the new Secondary Interview status:
    ◦ Secondary Interview - ASA
• Run lookup on TypeofCase, existing-day or prior-day confirmation of MB's S1, and then fair allocation of Referrals by existing MB S1.
    ◦ If no media buyer can be identified, list "UNKNOWN" in S1 value with LPLeadID as "Referral"
    ◦ Update lpcontractID and lpcontractname based upon TIP's TypeOfCase value but using same-day, prior-day validation of the ID
@Brian Hirst, the io_billable_rate_effective_datetime field is ready in tort-intake-professionals.tip_prod_application.io_lr_case_types_revenue_rates for the Retool UI.
It populates the existing value if present and writes from the app to BigQuery
She can set her September ones to 9/4/2024 etc. I already did their prices
@Brian Hirst, notes from new TIP Billable Rate pricing review meeting with Tony, Brian T., Cam and me. It will change our TIP App and table a bit. Nothing crazy:
• basecostrate FLOAT
• qacostrate FLOAT
• screeningcostrate FLOAT
• tipprofitmargin_percent INTEGER
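If these land as new columns on the revenue_rates table above, the change could look something like this (a sketch only; the table path is copied from the earlier message, and the DDL uses BigQuery's FLOAT64 / INT64 equivalents of the FLOAT / INTEGER noted in the meeting):
```
# Hedged sketch of the schema change.
from google.cloud import bigquery

client = bigquery.Client(project="tort-intake-professionals")

DDL = """
ALTER TABLE `tort-intake-professionals.tipprodapplication.iolrcasetypesrevenue_rates`
  ADD COLUMN IF NOT EXISTS basecostrate FLOAT64,
  ADD COLUMN IF NOT EXISTS qacostrate FLOAT64,
  ADD COLUMN IF NOT EXISTS screeningcostrate FLOAT64,
  ADD COLUMN IF NOT EXISTS tipprofitmargin_percent INT64
"""
client.query(DDL).result()
```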
@Brian Hirst, can you update the TIP Ops Retool App to show the "Current Effective Date and Time" where I have the red box?
*Thread Reply:* @Brian Hirst, did we fix the error message?
*Thread Reply:* Not yet. I need to email Shane. I can't find anything
*Thread Reply:* @Brian Hirst, can you go ahead and make the Current Effective Date Time field "read only" then, while you get that done? Please get that done ASAP; we'll start to go DEEP into DBT SQL and BigQuery Data Engineering this week. Get these small items complete so we don't lose our place or progress.
*Thread Reply:* I'm working on that as well. Note: the current effective date WILL NOT alter the table. It can only read what's in the table
*Thread Reply:* Understood, but our Users will get confused. Tony, Malissa, etc.; they should just see a "read only" field. Thanks @Brian Hirst!
@Ryan added. I removed it populating from the "Billable Effective Date Time" box; that box just writes into BigQuery now, similar to "Current Billable Rate" and "New Billable Rate"
@Brian Hirst, make the "Current" field "read only" like we did with the Current billable rate
@Ryan That field won't edit anything in BigQuery, only display what's currently in BigQuery. Note: "Billable Effective Date Time" needs to have a value in it for the app to work.
Correct, but make it read-only like "Current" so that people / users do not get confused
And you should make sure the "error" for trying to save a "record" without the date doesn't pop up; it should say "Please Update Effective Date To Save Record".
*Thread Reply:* @Brian Hirst, focus on these two items, have them complete before we meet on Monday to start more dev work on rev rates and lr statuses.
*Thread Reply:* @Ryan I got the date part figured out. Error message is giving me some issues but I'm going at it a different way now. I am going to make it so the button is not able to be used unless every field is filled.
I'll need to do some research on editing the error message.
@Nicholas McFadden, how you coming on lr_inbox? FYI @Dustin Surwill @Brian Hirst
@Dustin Surwill has joined the channel
*Thread Reply:* @deleted-U0410U6Q8J3 FYI
*Thread Reply:* Dustin will need to change it in the container. I will ask him
*Thread Reply:* Try the lrinboxview. The lr_inbox table is not the final table you should pull from, we are still working on that.
*Thread Reply:* Dustin, I'll wait until that table is ready.
*Thread Reply:* @deleted-U0410U6Q8J3, I'll remove the code and work on LR_statuses instead until they're ready
*Thread Reply:* @Dustin Surwill, what's the latest on LR Leads Inbox pipeline dev?
*Thread Reply:* In the testing phase almost done
@James Turner, when you get in today. Have @Zekarias Haile get access to your Miro, I can have him "add to the Shield front end" to document the "Legacy AWS Lambda" as a section for our legacy system. (like a new, linear section below the new architecture flow?)
Just sent that over to Zek!
@Nicholas McFadden @Nick Ward @James Scott, on the call. Got a short one today so we can get into smaller groups to finish up our resource plan.
@Dustin Surwill, good morning. What's latest on below:
*Thread Reply:* FYI @Brian Hirst @Nicholas McFadden
*Thread Reply:* 1. It's currently manual Python, but once the backfill is complete it should be a GCP Function. 3931/602952 - 0.65%. Currently looking into making the code concurrent and a way to pull the backfill faster
*Thread Reply:* @Dustin Surwill, great. I'll take a look at #2 and advise. Let me know more on 1 and 3 next week.
*Thread Reply:* @Ryan I am working on the leadspedia all leads
*Thread Reply:* @Nicholas McFadden, excellent. Boom, LFG!
*Thread Reply:* @Ryan talked with Leadspedia and you have to paginate in 30-day increments, 1,000 records at a time, and only 10 calls/sec. Building out the code to accommodate these requirements. Setting up a meeting to discuss increasing these limitations
*Thread Reply:* @Nicholas McFadden, ok. Let me know if you hit any hurdles. Thanks for update.
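For planning, a rough shape for the backfill loop under the limits Nick listed; the endpoint, params, and response shape here are placeholders, not the real Leadspedia API:
```
# Sketch: 30-day windows, 1,000 records per page, stay under 10 calls/sec.
import time
from datetime import date, timedelta

import requests

BASE = "https://api.leadspedia.example"   # placeholder base URL
PAGE_SIZE = 1000
MIN_INTERVAL = 1 / 10                     # max 10 calls per second

def fetch_window(session, start: date, end: date):
    offset = 0
    while True:
        t0 = time.monotonic()
        resp = session.get(
            f"{BASE}/leads",              # placeholder endpoint
            params={"fromDate": start.isoformat(), "toDate": end.isoformat(),
                    "start": offset, "limit": PAGE_SIZE},
        )
        resp.raise_for_status()
        rows = resp.json().get("data", [])
        yield from rows
        if len(rows) < PAGE_SIZE:
            break
        offset += PAGE_SIZE
        time.sleep(max(0, MIN_INTERVAL - (time.monotonic() - t0)))

def backfill(session, first_day: date, last_day: date):
    start = first_day
    while start <= last_day:
        end = min(start + timedelta(days=29), last_day)   # 30-day increments
        yield from fetch_window(session, start, end)
        start = end + timedelta(days=1)
```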
@Dustin Surwill, can you send me similar SQL like you did for LR_Inbox, but all I need are leads where current_status (if that's the field name) LIKE '%Signed e-Sign%' and type_of_case LIKE '% - Shield Legal%'
The only fields I need selected are in red boxes. Let me know ETA on this, it's going to help me create "real time Media Buyer e-sign dashboards" as a "fail-safe" to TIP Slack Alerts Lambda/Zapier failures.
FYI @James Turner
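Something like this is the shape I mean (the view path is a placeholder for whatever Dustin's LR_Inbox view is called, and the field names are the assumed ones above):
```
# Sketch of the ask above; narrow SELECT * to the red-boxed fields once confirmed.
from google.cloud import bigquery

SQL = """
SELECT *
FROM `project.dataset.lr_inbox_view`        -- placeholder path
WHERE current_status LIKE '%Signed e-Sign%'
  AND type_of_case   LIKE '% - Shield Legal%'
"""

for row in bigquery.Client().query(SQL).result():
    print(dict(row))
```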
@Brian Hirst, you thinking what I am thinking? DBT Model from TIP into SL, into MB Dashboards, updating every 1 to 5 minutes; then we'll have a table for @Nicholas McFadden to help create a Pub/Sub message service feeding a Cloud Function Slack integration to replace Lambda / Zapier / Email.
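The Slack end of that could be as small as this sketch, assuming a Pub/Sub-triggered Cloud Function and a Slack incoming webhook; SLACK_WEBHOOK_URL and the payload field names are assumptions, not existing config:
```
# Sketch: Pub/Sub-triggered Cloud Function posting each e-sign to Slack.
import base64
import json
import os

import functions_framework
import requests

@functions_framework.cloud_event
def esign_to_slack(cloud_event):
    # Pub/Sub delivers the message body base64-encoded inside the CloudEvent.
    payload = json.loads(base64.b64decode(cloud_event.data["message"]["data"]))
    text = (
        f"Signed e-Sign: {payload.get('contact_first_name', '')} "
        f"{payload.get('contact_last_name', '')} | "
        f"MB: {payload.get('s1', 'UNKNOWN')} | "
        f"{payload.get('type_of_case', '')}"
    )
    requests.post(os.environ["SLACK_WEBHOOK_URL"], json={"text": text}, timeout=10)
```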
*Thread Reply:* @Dustin Surwill, got an ETA on this? Ty!
FYI @James Turner @Brian Hirst @Nicholas McFadden
*Thread Reply:* I do not, I am dealing with integration issues.
Fewer moving parts definitely appeals to me
@Ryan let's set a time to discuss next week
@Dustin Surwill, is there a Monday Form to submit Integrations Requests for Ben Crump's Gaming Addiction (you did this one already, yeah?) and Depo Provera?
*Thread Reply:* We do not have a Monday form. James's team clicks a button on their board to send us the campaign when they are done (whether it needs an integration or not), then I look through Slack for a message saying that the campaign needs an integration before telling the team to work on it.
*Thread Reply:* Ok, can you please add Depo Provera to the queue when you're free later today?
*Thread Reply:* Depo Provera to BCL? We would need their UDFs and we have not reached out for those. We were only told to integrate with Napoli
*Thread Reply:* After we resolve today's issues, reach out to Anthony for the UDFs for Ben Crump and the partner firm
*Thread Reply:* @Dustin Surwill, getting UDFs confirmed for you.
@here, for everyone's knowledge, here is the SL & TIP Stack recommendation we got last year with our Google-funded onboarding and training engagement. Everyone has the backstory now.
This is cool- thanks for the backstory on this!
@James Turner @Brian Hirst
SELECT
  ALTID,              --(custom field in the Form Fields tab in ALL LR Intake Templates)
  ContactFirstName,
  ContactLast_Name,
  S1                  --(custom field in the Form Fields tab in ALL LR Intake Templates)
FROM 33 tables and stuff
WHERE
  --THE FILTERS WILL TELL YOU WHEN THE SYSTEM SENDS THE E-SIGN
  (UserName = 'System' AND current_status = 'Signed e-Sign' AND prior_status = 'Sent e-Sign')
  OR
  --THE FILTERS SHOULD CHECK WHEN A USER SENDS THE E-SIGN
  (UserName != 'System' AND current_status = 'Signed e-Sign' AND e-signature PDF received) -- << meaning the retainer signature data signal
As a Media Buyer User, I want a page in my MB Dashboard that replaces the need for TIP Slack Alerts to notify me in real time of a Signed e-sign.
As a Media Strategy VP internal at Shield (Cameron), I want those same slack alerts in a dashboard by day by vertical by media buyer AND have a copy of that slack alert put into the corresponding slack channels.
How:
Steps: TABLE CREATION
@Nicholas McFadden, these are ALL of the requirements for the new GCP BigQuery DBT Pub/Sub service to communicate real-time-ish e-signs to Cam and MBs.
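The publishing half could follow this shape, as a sketch only: something scheduled (Cloud Scheduler or a Cloud Function) pulls new Signed e-Sign rows out of the DBT-built table and publishes them to Pub/Sub for the Slack function. The project, topic, table, and timestamp column are placeholders until the TABLE CREATION step above is done; the selected fields mirror the SQL above.
```
# Sketch of the producer side of the e-sign Pub/Sub pipeline.
import json
from datetime import datetime

from google.cloud import bigquery, pubsub_v1

PROJECT = "shield-legal-bi"            # placeholder project
TOPIC = "esign-alerts"                 # placeholder topic

def publish_new_esigns(last_seen: datetime) -> None:
    bq = bigquery.Client(project=PROJECT)
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(PROJECT, TOPIC)

    sql = """
    SELECT ALTID, ContactFirstName, ContactLast_Name, S1, type_of_case, status_changed_at
    FROM `shield-legal-bi.placeholder_dataset.esign_events`   -- placeholder table
    WHERE status_changed_at > @last_seen
    """
    job = bq.query(
        sql,
        job_config=bigquery.QueryJobConfig(
            query_parameters=[
                bigquery.ScalarQueryParameter("last_seen", "TIMESTAMP", last_seen)
            ]
        ),
    )
    for row in job.result():
        # Block on each publish so failures surface immediately.
        publisher.publish(topic_path, json.dumps(dict(row), default=str).encode("utf-8")).result()
```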
(US) +1 401-552-4192 PIN: 972 639 452#
Ryan said he had one more thing, no screen sharing required
Bytecode IO has been disconnected because all members from that organization have left this channel.
Data Update - Updates have been made to the Looker Enterprise "8 - All Law Ruler Leads With Enrichment" dataset, on the "Lawruler Lead Create Date Time" data point, to fix an existing bug.
Testing has been done to mitigate issues but please notify me if you experience any issues with this change. Thank you!
@Ryan Monthly Snapshot System for basic + TIP financial info on leads has been developed and is currently running. Attached is documentation.