Looks like Excel Power Query can connect to PostgreSQL databases. I would just need to install a PostgreSQL ODBC connector, the same way you would when connecting to SQL Server from Excel. In the meantime I can use BigQuery as we discussed, but for automation I would like to pull the data directly into Excel via a Power Query. I can handle installing the PostgreSQL ODBC connector (I just have to get our admin's approval). I would need a few things from you: 1) Whether or not a VPN is used and the credentials for that (if needed). 2) Server name. 3) Database name. 4) Username / password.
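Once the ODBC driver is installed, I could sanity-check the connection with a quick pyodbc test in Python before wiring up the Power Query. This is only a sketch with placeholder values (server, database, and credentials are pending the details above, and the driver name assumes the standard PostgreSQL Unicode psqlODBC driver):

```python
# Sketch: quick ODBC connectivity test against the Postgres server.
# Server, database, and credentials are placeholders pending the
# details requested above.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={PostgreSQL Unicode};"
    "SERVER=db.example.internal;"   # placeholder server
    "PORT=5432;"
    "DATABASE=lr_data;"             # placeholder database
    "UID=report_user;"              # placeholder username
    "PWD=********;"
)
print(conn.cursor().execute("SELECT version()").fetchone())
conn.close()
```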
I am writing some report automations for a few of our Monday reports. To avoid rewriting SQL code, do you two know where to find the SQL for the query that Law Ruler CRM generates against its back-end database when you download a report, using the parameters selected in the Law Ruler GUI? Surely a SELECT query gets built somewhere before the form is submitted. I would just need to know where it lives so I can build an Excel Power Query around it as the data source for my automation.
We do not have access to their system like that. I have a Python script that I made that will generate a custom report with all PII and questions from the lr-data database.
It seems I might have to write the SQL SELECT statements to query directly from BigQuery, which is what I thought. I was just wondering if we already knew what tables and fields were being queried based on the selections from the Law Ruler GUI.
Ok. I am sure (if needed) we can run your Python script through ChatGPT to convert it into a SELECT statement so I can set up Power Queries in my Excel automation. I am just trying to automate what can be automated to reduce the redundancy of some manual tasks.
They are two completely different systems. The database (Postgres) and BigQuery hold data that I have a function running every hour to keep up to date from the LR API. The LR GUI is controlled by LR, the company.
Ok. So it seems I will have to write SQL queries against the BigQuery side of things, selecting all the tables/fields for my reports and pulling them into Excel with Power Queries. I'm not sure the Python code you have would help directly there, although with ChatGPT it may be useful for creating SELECT queries based on parameter selections.
Here is my script. It uses 2 SQL queries and stitches them together by lead_id.
I'm working on automating a process using Python. My goal is to connect to BigQuery, run a SQL query, and export the results directly to an Excel file. I’m aware that I can currently export this data using Law Ruler layouts, but I’m looking to eliminate that manual step by incorporating everything into my Python script—similar to what I’ve already accomplished using Excel Power Queries (which rely on the BigQuery connector). However, when setting up the Python connection, it's prompting me for a service account key file. When I attempted to view or access this file, I received a message stating that I don't have the necessary permissions. Could you assist me with obtaining the appropriate access or providing the correct service account key file?
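For reference, this is roughly the shape of what I'm building. It's only a minimal sketch using google-cloud-bigquery and pandas; the query, table name, and output path are placeholders, not the real report:

```python
# Sketch: run a BigQuery query and export the results to Excel.
# Assumes application-default credentials are already configured;
# the query and output path below are placeholders.
from google.cloud import bigquery

def export_report(query: str, output_path: str) -> None:
    client = bigquery.Client()               # uses application-default credentials
    df = client.query(query).to_dataframe()  # run the query, pull results into pandas
    df.to_excel(output_path, index=False)    # write to Excel (requires openpyxl)

if __name__ == "__main__":
    export_report(
        "SELECT lead_id, status FROM `my-project.lr_data.leads` LIMIT 100",  # placeholder
        "monday_report.xlsx",
    )
```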
Install the gcloud CLI. I can help a few minutes after I get back from lunch, so in about 20 minutes.
gcloud auth application-default login
Set-ExecutionPolicy -ExecutionPolicy Unrestricted -Scope CurrentUser
Packages needed for postgres and bigquery:
cloud-sql-python-connector[pg8000]==1.17.0
google-cloud-bigquery==3.25.0
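For anyone else setting this up, here is roughly how the Cloud SQL connector package gets used from Python. This is a sketch only; the instance connection name, database, and user below are placeholders, not our real values:

```python
# Sketch: connect to a Cloud SQL Postgres instance with the
# cloud-sql-python-connector and the pg8000 driver.
# Instance name, db, user, and password are placeholders.
from google.cloud.sql.connector import Connector

connector = Connector()
conn = connector.connect(
    "my-project:us-central1:my-instance",  # placeholder instance connection name
    "pg8000",
    db="lr_data",        # placeholder database
    user="report_user",  # placeholder user
    password="********",
)
cur = conn.cursor()
cur.execute("SELECT 1")
print(cur.fetchone())
conn.close()
connector.close()
```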
Hey Dustin, I am doing the Orphan Client (formerly Unsigned Clients) list for Brittany. She told me you had a query that returns a list of LeadID, Status, Type of Case, and e-Signed dates. Could you please let me know where that query is and whether I have access to it? Thanks.
Here is the Postgres query that will get you that information. The tables are also linked in BigQuery under the lr-data dataset if you wish to convert the query:
SELECT
  l.id AS leadid,
  ct.name AS casetype,
  ls.name AS status,
  (SELECT MAX(date) FROM lead_history_status WHERE leadid = l.id AND tostatus = 'Signed e-Sign') AS esigndate
FROM lead l
LEFT JOIN lead_status ls ON l.statusid = ls.id
LEFT JOIN case_type ct ON l.casetypeid = ct.id;
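If you do want it on the BigQuery side instead, something like this should be close. I'm assuming the linked dataset shows up as lr_data and the table names carry over as-is, so treat it as a starting point rather than the final query:

```python
# Sketch: the same query adapted for BigQuery, run with the Python client.
# The dataset name (lr_data) is an assumption; adjust to however the
# linked tables actually appear in BigQuery.
from google.cloud import bigquery

QUERY = """
SELECT
  l.id AS leadid,
  ct.name AS casetype,
  ls.name AS status,
  (SELECT MAX(h.date)
     FROM `lr_data.lead_history_status` h
    WHERE h.leadid = l.id AND h.tostatus = 'Signed e-Sign') AS esigndate
FROM `lr_data.lead` l
LEFT JOIN `lr_data.lead_status` ls ON l.statusid = ls.id
LEFT JOIN `lr_data.case_type` ct ON l.casetypeid = ct.id
"""

client = bigquery.Client()
print(client.query(QUERY).to_dataframe().head())
```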
Thank you. I just didn't want to rewrite something if you already had it working. Thanks!
I didn't. I butchered another query for it.
Besides my Google Drive, do you know if Tort Intake Professionals has a file share/server where I can store very small amounts of data that stays accessible when my PC is off, and that I (or someone else) can grant read permissions to?
I created encrypted tokens for the multi-user automated audit application I am writing in Python. As part of that process I register users and give them permission to use the application. The user data is stored on encrypted security tokens that authenticate the user to the app at launch, rather than just anyone with a Google account. Right now I have the encrypted token data on my Google Drive, but if something happened to me or my account, users could not authenticate to the app. We could set up a service account for our department and go that way, but authentication like this should not live on one person's account, as I said, in case I quit, get fired, etc. Hope this makes sense.
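To give you an idea of what the token files contain, here is a sketch of the general pattern using the cryptography package. It is not my exact implementation (key handling, registration, and the real fields are simplified):

```python
# Sketch of the token idea: user/role/version data serialized and
# encrypted with a symmetric key (cryptography.Fernet). Simplified --
# the real app handles registration and key distribution separately.
import json
from cryptography.fernet import Fernet

def create_token(key: bytes, user: str, role: str, version: str) -> bytes:
    payload = json.dumps({"user": user, "role": role, "version": version})
    return Fernet(key).encrypt(payload.encode())

def read_token(key: bytes, token: bytes) -> dict:
    return json.loads(Fernet(key).decrypt(token).decode())

key = Fernet.generate_key()
token = create_token(key, "jdoe", "auditor", "1.4.2")   # example values only
print(read_token(key, token))  # {'user': 'jdoe', 'role': 'auditor', 'version': '1.4.2'}
```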
Not sure if you're the right person to ask about this or not. You're one of the only other very technical people I have been exposed to, so I figured I would ask you.
Sounds like a case for converting it to the website I built: shieldlegal.dev
It might be able to be converted at some point, but at this point it's strictly designed to run on desktop. It relies heavily on the Windows OS. I can use the Google Drive for the token storage, but either a service account with Google Drive or a true file server would work great as well.
Why does it need an account system if it's running on the desktop?
What functions of the OS are you using that you think cannot be converted?
Because I need to see who has what version installed, and it needs to update itself based on the version the user is running. It also needs to authenticate role-based privileges and grant access to parts of the software. Everything is coded. I would just like a place to host my user tokens other than my Google Drive.
Why not authenticate against google cloud IAM?
The security tokens I generate hold more information than just encrypted authentication. They hold information about who the user is, the software version they are running, etc.
So, it's acting as a configuration file for the software as well. Everything is encrypted against prying eyes.
It all works nicely as is and will help me administer the software.
Just Google BigQuery to enrich my audit data and Gmail SMTP for sending error reports.
Why not rely on the gcloud CLI application default credentials that everyone has to set up and log in with on their own system?
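For what it's worth, picking those up only takes a couple of lines once everyone has run `gcloud auth application-default login`. A sketch, assuming the standard google-auth package:

```python
# Sketch: pick up the application-default credentials created by
# `gcloud auth application-default login`. Assumes the google-auth
# package; the project is whatever ADC resolves to.
import google.auth

credentials, project = google.auth.default()
print("Authenticated against project:", project)
```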
The token files I created are dual purpose. They hold encrypted access control as well as application version information. What I did is working effectively. I was just asking about a place to store the token files. I can house them on my Google Drive for now and give only the users access if we don't have a file share.
Why not have it stored locally for each user?
I will have the repo set up soon. I just got access to GitHub through James.
Can I get either access to or a copy of the source code for your application?
It's on a GitHub repository under the Tort Intake Professionals account. Although, I have to update the version that is posted there.
Yes, that was my next step. I had to do extensive testing on the UI and back-end first before I made it final.
GitHub is version control, therefore you should be uploading checkpoints for every group of changes (every day or couple of days).
Correct. That part is not fully set up yet to integrate with Visual Studio. It was my next step.
There is a command line tool (git) you should use until you have that integration.
I will address that very soon. Our immediate goal is deployment on Monday, and I had to ensure I had a fully working and tested application before deployment. As mentioned, full integration with GitHub version control is my next priority.
Those 4 campaigns are secondaries which means they do not normally have a signed document
This is data from a few days ago from the same layout, showing the same case types had "E-Sign Signed Date" values, with no NULLs, with the exception of test records.
Those case types are different: the picture you just sent uses the intake case types, but the first image from 20 minutes ago has the secondary case types. SEC in the case type marks it as secondary.
Thank you for the insight. I will reach out to you if I have further questions. I'm just trying to understand why I was getting two different results using the same layout. Maybe it is a case type selection issue, from what you are saying. I thought they may have changed the layout; that's why I reached out to you.
We had issues with people deleting the report "MI Intake Files-Daily Run Automation", which is sent to mclark@tortintakeprofessionals.com, pjerome@tortintakeprofessionals.com, and brittany@tortintakeprofessionals.com on a daily basis. Can you let me know if someone deleted this report again, who deleted it and when, and who recreated it and when? I know I created one after the first delete (when Olivia accidentally deleted it) and got good data, but a few days ago it may have been deleted again. I am wondering whether whoever recreated it selected the correct campaigns, and maybe that is why we were not getting the "E-Sign Signed Date".
Thank you! That's what I suspected based on what I see and what you told me about the two case types. This concurs with my findings and suspicions. Again, thank you for your help!