Don’t get me wrong: I’ve known the Java language since 1.2 and built a few things here and there with Swing and applets. Plus I’m aware of JVM languages like Jython/JRuby and why JARs and the JVM are supposed to be good.
But every time I go to take a serious look at it, there doesn’t seem to be an actual reason to use Java.
So, how can one take BDD practices (like those used in Node/Python/Ruby/C++) and apply them to Java? (A sketch of the kind of workflow I mean is at the end of this post.)
Which technologies make it still viable, and what can one do to jump into it other than “download X IDE that just calls Gradle to do all the heavy lifting”?
I’m asking with no opinions or bias, seriously wanting to know why and what technologies even make it still relevant other than Android, Elastic, Minecraft modding, JDBC, Selenium, Hadoop, Tomcat, and the ever-bloated Spring Framework/Eclipse?
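For concreteness, here’s the kind of workflow I mean, as a minimal sketch rather than anything authoritative: Cucumber-JVM speaks the same Gherkin as the Node/Python/Ruby tools, and the step definitions below assume only the cucumber-java dependency plus a made-up Cart class standing in for real application code.

```java
// checkout.feature (src/test/resources) -- the same Gherkin you'd write with
// cucumber-js, behave, or Ruby Cucumber:
//
//   Feature: Checkout
//     Scenario: An empty cart totals zero
//       Given an empty cart
//       When the total is calculated
//       Then the total should be 0

// Step definitions in plain Java, wired up by the cucumber-java dependency.
import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;

public class CheckoutSteps {

    // Trivial stand-in for the real class under test (hypothetical).
    static class Cart {
        int total() { return 0; }
    }

    private Cart cart;
    private int total;

    @Given("an empty cart")
    public void an_empty_cart() {
        cart = new Cart();
    }

    @When("the total is calculated")
    public void the_total_is_calculated() {
        total = cart.total();
    }

    @Then("the total should be {int}")
    public void the_total_should_be(int expected) {
        if (total != expected) {
            throw new AssertionError("expected " + expected + " but got " + total);
        }
    }
}
```

Run it with whichever runner you prefer (the JUnit Platform engine or Cucumber’s own CLI); the point is that the Given/When/Then layer itself is plain Java.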
Based on my limited experience, Java looks to be pretty thoroughly entrenched in the *nix world doing server-side automation - Tomcat doesn’t look to be going away anytime soon.
It’s frequently taught even in high-school-level computer science courses. It’s really easy to wrap your head around a lot of concepts, particularly object-oriented ones, in Java.
Not that I necessarily think it’s any excuse to not pick up a different language afterwards, but if the basis for your skills is already there, and the demand is there for Java applications…
I’m a site reliability engineer by trade. I am painfully familiar with the fact that it doesn’t matter what language your devs know; if you get enough devs together, any code will be impossible to maintain.
keep people writing impossible-to-maintain cowboy code
I’ve seen that far too much in the wild, and it really gets at the heart of my question. Any time one searches for anything Java-related, they’re just told to use Eclipse and some other arbitrary JVM language, then presented with a bunch of “enterprise” solutions that are aimed squarely at non-techie managers and marketing types.
That’s one reason monolithic applications are falling by the wayside for large systems. Microservices are starting to take the lead; that way each group can work independently in its own language and have it all function as a coherent whole.
The cluster of applications I support at work relies on Tomcat for a critical function. I know it’s key to the batch interfaces that handle requests from upstream. It runs on the downstream nodes as well, but I’m not as versed in that since it’s a vendor black box.
It’s also easier to containerize and scale an application efficiently when you break it down into services. Though, when you do this, someone has to maintain the mechanism through which the services communicate with each other; it can increase the code maintenance load if you’re not careful in your engineering.
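To make “a service” concrete, here’s a minimal, illustrative sketch (not from any real system) of a single-responsibility Java process using only the JDK’s built-in com.sun.net.httpserver, exposing one health endpoint that a neighbouring service or an orchestrator could poll:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

// A tiny stand-alone service: one process, one responsibility, one HTTP endpoint.
// The class and port are made up for the sketch.
public class InventoryService {
    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);

        // A health endpoint another service or a container orchestrator can poll.
        server.createContext("/health", exchange -> {
            byte[] body = "{\"status\":\"UP\"}".getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().add("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(body);
            }
        });

        server.start();
        System.out.println("inventory service listening on :8080");
    }
}
```

Everything outside that process boundary (URLs, retries, discovery) is exactly the communication mechanism someone has to maintain.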
Anyone can come up with a million and one reasons not to use X. The question is for someone who’s on the outside of the “Java inner circle” looking in.
At both my former employer and my present one, the f__king middleware is the source of at least as many problems as the core applications themselves. Bring on the microservices model with common databases between them - maybe we can retire some of the profligate legacy systems in the process.
I use Java every day for software test automation. You can use Python, C#, and others, but I find that in test automation Java adoption is much higher. I can’t think of a tool I couldn’t use with Java, but I can think of plenty that don’t support Python, and a few that don’t support, or under-support, C#.
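A bare-bones sketch of what that looks like day to day, assuming the selenium-java dependency and a working ChromeDriver setup; the URL and the title check are placeholders, not from a real project:

```java
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

public class SmokeTest {
    public static void main(String[] args) {
        // Recent selenium-java versions can resolve the browser driver themselves;
        // older setups need a chromedriver binary on the PATH.
        WebDriver driver = new ChromeDriver();
        try {
            driver.get("https://example.com");   // placeholder for the application under test
            String title = driver.getTitle();
            if (!title.contains("Example")) {    // placeholder check; a real test asserts real behaviour
                throw new AssertionError("unexpected title: " + title);
            }
        } finally {
            driver.quit();                       // always release the browser session
        }
    }
}
```

The same WebDriver calls work from JUnit or TestNG; the main() wrapper is just to keep the sketch dependency-light.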