
I have an Acumatica partner that I do some customizations for. After doing a customization, they want me to do some data cleanup because there are fields in a custom table that need to be backfilled, and it would be WAY easier for me to update my custom tables in SQL Query Analyzer. I could write "one-time" scripts and put them in a customization project, but without visibility to see whether a script did what I wanted, and without access to query the tables, it is a mind-numbing exercise to have the partner run the script in the project, create an extract, and send it to me. I'm sure you know what I am talking about here.

I have my own databases hosted on Azure for my primary business and I can connect to them from my local PC, but the setup necessary to do that is not easy and requires some firewall changes and setting up Active Directory accounts.  

So, is there any way I can get access to run queries on SQL databases hosted by Acumatica in the cloud, etc.? If anyone has done this, I'd appreciate any advice. I know this is a vague question and I am not asking for specific details, but it would be great to know whether anyone has even done it.

Thanks,

Joe

@joe21 that would be extremely valuable. I wouldn't count on Acumatica ever opening up direct connections to the SQL back end; however, I can envision an extension to Acumatica that basically gives us a rudimentary SQL command shell we could use to run selects, updates, etc. from the web browser.

That was one of the ideas behind the “AcuShell” project we did at the Acumatica Summit Hackathon 2 years ago -- it runs in the browser and allows you to run arbitrary C# code, but I can envision extending this to work with pure SQL as well. Demo: https://recordit.co/hDfwaTJg4L


Thanks Gabriel. It is actually just good to know that I am not alone in wishing this were an option. If it were and I did not know it, I'd be kicking myself later on.


In theory you could create a JoesLog table, then run your scripts and have them write their output to that table. That would let you see and query the results of your scripts, and then you can run one more script to zap the table.
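To make the pattern concrete, here is a minimal sketch in Python with an in-memory SQLite database standing in for SQL Server; the table and column names (JoesLog, MyCustomTable, CustomField) are hypothetical, and the real scripts would of course be T-SQL run from the customization project:

```python
import sqlite3

# In-memory SQLite stands in for the real SQL Server database.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical custom table with a field that needs backfilling,
# plus the temporary log table for script output.
cur.execute("CREATE TABLE MyCustomTable (ID INTEGER PRIMARY KEY, CustomField TEXT)")
cur.execute("CREATE TABLE JoesLog (LoggedAt TEXT, Message TEXT)")
cur.executemany(
    "INSERT INTO MyCustomTable (ID, CustomField) VALUES (?, ?)",
    [(1, None), (2, "already set"), (3, None)],
)

# The "one-time" cleanup script: backfill, then record what it did.
cur.execute("UPDATE MyCustomTable SET CustomField = 'default' WHERE CustomField IS NULL")
backfilled = cur.rowcount
cur.execute(
    "INSERT INTO JoesLog (LoggedAt, Message) VALUES (datetime('now'), ?)",
    (f"{backfilled} rows backfilled",),
)

# A small extract of JoesLog now shows what the script actually did.
for (message,) in cur.execute("SELECT Message FROM JoesLog"):
    print(message)  # -> 2 rows backfilled

# The final "zap" script removes the log table again.
cur.execute("DROP TABLE JoesLog")
conn.commit()
```

The same idea carries over to T-SQL unchanged: each one-time script inserts its @@ROWCOUNT plus a message into the log table, the partner extracts that one small table instead of everything you touched, and a last script drops it.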


Hello Joe,

 

You may want to explore the following alternative:

1. Create a Data Provider for SQL Server: this provider would specify identical tables on your SQL Server, or even map to new tables/views that better fit your analysis.

2. Create an Export Scenario based on the above Data Provider.

3. Run this Export Scenario once for a full copy (first run).

4. Change the Export Scenario to export only the updates since the last run.

5. Schedule the above scenario to run as often as you need (careful here if the volume of rows and/or updates is very high).

Note: on the machine running your SQL Server, open port 1433 for TCP, and in the Scope tab of the firewall rule add the IP address of the Acumatica instance that will run the Export Scenario. This will allow Acumatica to reach your SQL Server and update the tables.
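As a quick sanity check after the firewall change, a small script like this sketch can confirm the port is actually reachable from the machine that will run the scenario (the hostname shown is a placeholder, not a real address):

```python
import socket

def port_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port can be opened."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # DNS failure, connection refused, or timeout.
        return False

# Example (placeholder host -- substitute your SQL Server's public address):
# print(port_reachable("my-sql-server.example.com", 1433))
```

If this returns False from the Acumatica side but True from inside your own network, the firewall scope is the first thing to re-check.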

You can then use Analyser on these tables, which are copies of the production tables in Acumatica.

Let me know if that works for you.

 


I like the idea, but Export Scenarios work at the graph/DAC level, which is one layer above the database. For debugging purposes this is not ideal in my opinion :( 


Hi Gabriel, 


I see your point!

If your customisation is only about data cleanup, then once you analyse the tables and decide what has to be updated or deleted, you can use scheduled import/export scenarios to update or delete the rows in those tables.

It is really complex (at least for me!) to work directly with the db tables and views in Acumatica. If you miss one of the dependencies, you have big trouble to sort out. Also, the keys for entities/docs are kind of tokens (it is not easy to understand where they come from and how they are created). On the other hand, using scenarios lets the DAC layer handle the complexity of the links between all the related tables/views, and you reduce to a minimum, if not to zero, the risk of introducing integrity issues.
 

Also, this approach would make your solution more version-independent. But if I understood correctly, you need to do the cleanup just once in the current release, and never again.

 

But if you need direct access to the tables/views of an instance running on SaaS, then that is another story, and I have never heard of someone doing it. You may want to consider duplicating the instance in a local environment, testing your scripts there, and once satisfied, importing your scripts as a customisation project.


Check out Tim Rodman's AugSQL • AugForums.com

