Question

Upgrade Stability Concerns – Impact on Existing Functionality

  • February 25, 2026
  • 1 reply


Acumatica has been consistently releasing new versions (R1 and R2) every six months, bringing valuable enhancements and new features. The pace of innovation is appreciated, and many of the new capabilities are genuinely useful.

However, from an implementation and consulting perspective, we are increasingly facing challenges during version upgrades.

In several upgrade scenarios, we have observed:

  • Existing functionality that previously worked without issue begins to fail or behave differently.

  • Customizations and reports requiring unexpected rework.

  • Performance inconsistencies post-upgrade.

  • Support responses indicating the issue is a “known issue” and will be addressed in a future release.

While we understand that no system upgrade is perfect, it becomes difficult to explain to customers why stable functionality in the previous version is impacted after moving to a newer release — especially when the resolution is deferred to the next version.

Frequent releases are beneficial for innovation, but stability and backward compatibility are equally critical for customer confidence and long-term adoption.

I would appreciate insights from the community on:

  • How others are managing upgrade risk mitigation

  • Best practices for pre-upgrade validation

  • Whether there are plans to strengthen regression testing for existing functionality before release

Looking forward to hearing others’ experiences.

1 reply

craig2
Pro I
  • February 25, 2026

My perspective is as an end-user/admin, but I feel like our upgrade process is pretty robust.

As a preamble, there was a pretty solid session on this topic at Summit, and the slides can be found in the post below.  Definitely worth a look (and Summit is a blast if you haven't gone).

https://community.acumatica.com/other%2Ddeveloper%2Dtopics%2D290/summit%2D2026%2Dbreakout%2Dpresentations%2Dtechnical%2D34414

 

First, it is definitely worth letting potential customers know what they are getting into with upgrades, even if on the surface it sounds like a negative for Acumatica.  Many organizations are used to installing their accounting software once, and it just exists and doesn't change ever.  But the reality is that technology keeps moving forward, and a company that chooses to be future-focused has a finite amount of resources to support problems when they arise.  Thus the requirement to (eventually) upgrade, so everyone is playing roughly the same game.  Now, it is worth mentioning to customers that you don't have to upgrade if you really don't want to.  It's just that Acumatica will probably charge extra to look at a problem outside its support timeframe.  And any decent VAR should be able to provide first-level support regardless of the version their client is on.

 

As an admin, I know my upgrade season is around 6-8 weeks long (with a couple kickoff meetings beforehand).  A smooth upgrade is a coordinated effort between my VAR, myself, and my users.  I would say a sandbox environment is critical not just for upgrades, but for experimenting during the “off season” as well.

 

Roles and expectations are laid out very clearly, very early:

  • VAR is responsible for creating a fresh copy of production into the sandbox, validating/upgrading customization packages, and executing both the test and production upgrades.
  • I am responsible for testing my own customizations, automations, and some basic transactions that cover the most common business scenarios.  If something breaks, I find the fix.  I may consult my VAR if I run into a bug or something I can’t figure out.
  • Users are responsible for testing the nuances of their own jobs.  This is communicated clearly to managers well in advance, and I’ll usually pop into team meetings to talk things over with folks.  We then set up a test schedule to best work around the different departments’ slower times, and managers are responsible for pushing users to test.
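The "basic transactions" pass in my part of that split can be partially scripted.  As a rough sketch (nothing official, and every value here is a placeholder): a small Python script that logs into the sandbox through Acumatica's contract-based REST API and pulls one record from each entity we care about, just to catch endpoints the upgrade broke outright.  The base URL, credentials, company name, entity list, and endpoint version string all need to match your own instance.

```python
# Hypothetical post-upgrade smoke test for an Acumatica sandbox.
# Base URL, credentials, company, entity names, and the endpoint
# version are all placeholders -- swap in your own instance's values.

def entity_url(base, name, version="24.200.001"):
    """Build the contract-based REST endpoint URL for an entity."""
    return f"{base}/entity/Default/{version}/{name}"

def smoke_test(base, user, password, company, entities):
    """Log in, fetch one record per entity, and return (name, status)
    pairs for anything that didn't answer 200."""
    import requests  # third-party; imported here so entity_url stays dependency-free

    s = requests.Session()
    s.post(
        f"{base}/entity/auth/login",
        json={"name": user, "password": password, "company": company},
    ).raise_for_status()
    failures = []
    for name in entities:
        r = s.get(entity_url(base, name), params={"$top": 1})
        if r.status_code != 200:
            failures.append((name, r.status_code))
    s.post(f"{base}/entity/auth/logout")
    return failures

# Example (placeholder values):
# failures = smoke_test("https://sandbox.example.com", "upgrade-tester",
#                       "***", "MyCompany",
#                       ["SalesOrder", "Shipment", "SalesInvoice", "StockItem"])
# An empty list means every endpoint answered.
```

This doesn't replace hands-on testing of business nuances, but it gives a fast first signal on the sandbox before asking users to spend their time.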

After a couple weeks of testing, I usually have roughly a page-long To Do list of things the upgrade breaks.  I schedule "Go-Live" for a Friday evening, during off hours, to minimize disruption.  I then spend however much of the weekend is necessary to tidy up all the known issues.  This can take 2-8 hours, depending on how dramatic the upgrade is.  My VAR will respond rapidly if I encounter an unexpected issue.  Then, come Monday morning, everyone boots up to a fancier Acumatica!  I like to use this time to change some of the login splash pictures as well; it makes it feel "new" to the users.

So really (like many things in life I suppose) it becomes a question of communicating clearly, early, and establishing ownership across the whole process.  Yes, things break and bugs are annoying.  We try to wait until there’s a Service Pack release before starting a test upgrade.  But a smooth upgrade requires heavy customer involvement and ownership, because they know the details of how things are supposed to work.  I’m usually pretty anxious the first Monday after an upgrade, but I’m generally just tapping my fingers and waiting for nothing.

Then on Tuesday, I get to start playing with my new toys!

Hope this helps a little, good luck out there.