For the past few months, I have been building what I call a universal connector for Knack. The software provides a wizard for setting up a connection between your database or a CSV/Excel file and Knack, so you can upload and download data through the API. The wizard automatically selects the database, table, and fields in Knack; all you have to do is a one-time mapping of the fields. Afterward, you can put the connection on a schedule. It also has connectors for Grok and ChatGPT.
I built it because I found third-party tools useful but limited in many cases (each tool had specific things it would not allow). I also got tired of building an API integration for every connection I needed. So I built a universal connector that lets you connect any database or file to Knack and send or receive data from your Knack database without writing custom API code for each instance.
The reason I am mentioning this is to see if anyone would have any interest in such a solution. I put a tremendous amount of time into building it, I use it weekly to build and manage my own connections, and I practically can't live without it now. However, I don't know if anyone else is facing the same challenges, or whether a connector like this would be of any use to you.
The goal was to build a tool that sets up connections to Knack quickly and easily, so that my employees who don't have API knowledge can set up connections themselves. I attached a screenshot of the tool so you can see how it looks.
I am just curious to know if anyone would find this useful.
I'm sure there are others that would be interested. I'm not sure I have an application for it at this time, but like @StephenChapman I'd be interested in seeing a demo and finding out more.
I have an application for this: I need to refresh a number of tables in Knack, from a SQL database, on a weekly basis. Right now we run a manual CSV extract and upload, but that is tedious. How are you pushing data into Knack? I assume you are calling a Knack API to do this? And I assume this runs on a PC or a cloud service that has an ODBC connection to the database, or something similar?
To answer Leigh's question: the interface pulls the data from the SQL database directly and sends it over to Knack using the Knack API set up on your account. The application maps the SQL fields to the Knack fields and transfers the data, which replaces the manual extract-and-upload process entirely. It can run every few minutes if you want, or be scheduled to run weekly. I am pulling data from several SQL Server databases right now and posting to Knack on a regular basis. In terms of access to the SQL Server, the app builds the connection natively via ODBC or whatever connection method you prefer. The app can run locally on the network, or a port can be opened to the SQL Server and limited to a specific IP for security. You can also use a VPN connection if you want the app to run from the cloud. There are many options to accomplish this; a rough sketch of the flow is below.
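For anyone curious what that flow looks like in general, here is a minimal sketch of the pattern, not the connector's actual code. It assumes pyodbc for the SQL Server connection and Knack's standard object-based REST API; the connection string, query, object key, and field mapping are all placeholders.

```python
# Minimal SQL-Server-to-Knack sketch. All credentials, the query,
# the object key, and the field mapping below are placeholders.
import pyodbc
import requests

KNACK_APP_ID = "YOUR_APPLICATION_ID"   # placeholder
KNACK_API_KEY = "YOUR_REST_API_KEY"    # placeholder
KNACK_OBJECT = "object_1"              # placeholder object key

# Hypothetical mapping from SQL column names to Knack field keys.
FIELD_MAP = {"CustomerName": "field_10", "OrderTotal": "field_11"}

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;"  # placeholder
)
cursor = conn.cursor()
cursor.execute("SELECT CustomerName, OrderTotal FROM Orders")  # placeholder query
columns = [col[0] for col in cursor.description]

headers = {
    "X-Knack-Application-Id": KNACK_APP_ID,
    "X-Knack-REST-API-Key": KNACK_API_KEY,
    "Content-Type": "application/json",
}

for row in cursor.fetchall():
    record = dict(zip(columns, row))
    # Translate SQL columns to Knack field keys via the mapping.
    payload = {FIELD_MAP[col]: value for col, value in record.items()}
    resp = requests.post(
        f"https://api.knack.com/v1/objects/{KNACK_OBJECT}/records",
        json=payload,
        headers=headers,
        timeout=30,
    )
    resp.raise_for_status()
```

In practice you would add retry handling and respect Knack's API rate limits, but the core loop is just query, map, and POST.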
If anyone wants a demo, just shoot me an email at sgm@scan-logic.com and I can show you the connector and what it does. Thank you for all the interest!
Erik, can you send me an email at sgm@scan-logic.com and I will get a demo set up for you. Or you can shoot me a message and send me your email address. Thanks.
Just an FYI: I updated the project to include connectors for Microsoft SQL Server, Excel, ChatGPT and Grok. You can now set up connections between any of these systems and send data to Knack from any of them. You can also send data out of Knack, have AI analyze it, and write the results back into your Knack database; a rough sketch of that round trip is below.
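As an illustration of that round trip (not the connector's actual implementation), here is a sketch using the official OpenAI Python client and Knack's object-based REST API. The model name, object and field keys, and the analysis prompt are placeholders.

```python
# Sketch of a Knack -> AI -> Knack round trip. Credentials, the object
# key, the write-back field, the model, and the prompt are placeholders.
import requests
from openai import OpenAI

KNACK_APP_ID = "YOUR_APPLICATION_ID"   # placeholder
KNACK_API_KEY = "YOUR_REST_API_KEY"    # placeholder
OBJECT_KEY = "object_1"                # placeholder
NOTES_FIELD = "field_20"               # placeholder field to write back

headers = {
    "X-Knack-Application-Id": KNACK_APP_ID,
    "X-Knack-REST-API-Key": KNACK_API_KEY,
}

# 1. Pull records from Knack.
records = requests.get(
    f"https://api.knack.com/v1/objects/{OBJECT_KEY}/records",
    headers=headers, timeout=30,
).json()["records"]

client = OpenAI()  # reads OPENAI_API_KEY from the environment

for rec in records:
    # 2. Have the model analyze the record.
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[{"role": "user",
                   "content": f"Summarize this record in one sentence: {rec}"}],
    )
    summary = completion.choices[0].message.content

    # 3. Write the analysis back to the originating Knack record.
    requests.put(
        f"https://api.knack.com/v1/objects/{OBJECT_KEY}/records/{rec['id']}",
        json={NOTES_FIELD: summary},
        headers={**headers, "Content-Type": "application/json"},
        timeout=30,
    )
```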
The app also has a full scheduler, so you can define your connections as presets (API to Knack, SQL to Knack, etc.) and then schedule each preset to run on a specific schedule. You can specify the age of the records to send as well, e.g., records 30 minutes old or newer, sync records as they change, run every 12 hours, or do a one-time download.
The overall idea is to set up your preset connections and then schedule them to run as needed. One task can download all records from a specific SQL table while another connects to an API and downloads records as they change. You can have multiple connections running and scheduled at once; a minimal sketch of that kind of scheduling follows.
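To give a sense of how preset scheduling with an age filter might work, here is a minimal sketch using the third-party `schedule` library; the preset functions and intervals are hypothetical stand-ins for the connector's own tasks.

```python
# Sketch of preset-based scheduling with an age filter. The two sync
# functions are hypothetical stand-ins for real connector tasks.
import time
from datetime import datetime, timedelta, timezone

import schedule

def sync_sql_to_knack(max_age_minutes=30):
    """Hypothetical preset: push SQL rows newer than the cutoff to Knack."""
    cutoff = datetime.now(timezone.utc) - timedelta(minutes=max_age_minutes)
    print(f"[sql->knack] syncing records modified since {cutoff.isoformat()}")
    # ... query WHERE modified_at >= cutoff, then POST to Knack ...

def sync_api_to_knack():
    """Hypothetical preset: poll an external API for changed records."""
    print("[api->knack] polling for changed records")
    # ... fetch changes, then POST to Knack ...

# Two independent presets, each on its own schedule.
schedule.every(30).minutes.do(sync_sql_to_knack, max_age_minutes=30)
schedule.every(12).hours.do(sync_api_to_knack)

while True:
    schedule.run_pending()
    time.sleep(1)
```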
The application keeps a running log of all activity, is very stable, and runs constantly.