Google is no longer recommending its proposal to make AJAX pages crawlable.
Prior to 2009, Google was simply unable to render and understand pages that use JavaScript to present content to users, essentially pages created dynamically.
At the time, Google provided a set of practices to ensure AJAX-based applications could be indexed by search engines. Today, as long as JavaScript and CSS files are not blocked (so Googlebot can crawl them), the search engine can render and understand web pages much like a modern browser does.
Google continues to recommend against disallowing Googlebot from crawling a site's CSS or JavaScript files, and it recently updated its technical Webmaster Guidelines to reflect this.
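For context, this kind of blocking usually happens in robots.txt. A minimal sketch of the anti-pattern and the fix, with hypothetical asset paths:

    # Anti-pattern: blocking script and style assets prevents
    # Googlebot from rendering the page as a browser would.
    #   User-agent: Googlebot
    #   Disallow: /js/
    #   Disallow: /css/

    # Recommended: let crawlers fetch the JS and CSS they need to render.
    User-agent: *
    Allow: /js/
    Allow: /css/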
If everything is working fine on your website, there's no pressing reason to abandon the AJAX crawling scheme, but Google does recommend following the principles of progressive enhancement, for example using the History API's pushState(), to ensure accessibility for a wider range of browsers (and for Google's own systems).
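As a rough illustration of that approach (the data-ajax attribute, the #content container, and the fragment endpoints below are hypothetical, not anything Google prescribes), a progressively enhanced link handler might look like this:

    // Links keep real hrefs, so navigation still works as ordinary page
    // loads if JavaScript fails. When it succeeds, content is fetched
    // in place and history.pushState() keeps the URL crawlable and
    // shareable instead of hiding state behind a #! fragment.
    document.addEventListener('click', function (event) {
      var link = event.target.closest('a[data-ajax]');
      if (!link) return;
      event.preventDefault();
      loadPage(link.href);
      history.pushState({ url: link.href }, '', link.href);
    });

    // Restore content when the user navigates with back/forward.
    window.addEventListener('popstate', function (event) {
      if (event.state && event.state.url) loadPage(event.state.url);
    });

    function loadPage(url) {
      fetch(url)
        .then(function (response) { return response.text(); })
        .then(function (html) {
          document.querySelector('#content').innerHTML = html;
        });
    }

The point of the pattern is that the server-rendered URL remains the source of truth: a crawler (or a browser without JavaScript) gets a complete page at every address, while capable browsers layer the faster AJAX behavior on top.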