The Army Lawyer published a groundbreaking article in early 2026 titled 'Forging the Bimodal Judge Advocate: Human-Machine Integration and the Future of the JAG Corps' — and it's the clearest signal yet that military justice is preparing for AI integration. The article doesn't treat AI as a distant possibility. It frames human-machine integration as the operational future of military legal practice, positioning judge advocates as professionals who must be competent in both traditional legal skills and AI-augmented workflows.

But military courts operate under constraints that make civilian AI adoption look simple by comparison. UCMJ proceedings involve classified information, operational security requirements, and command authority structures that no commercial AI platform was designed to handle. When a JAG officer can't input case facts into an AI tool because they're classified at the Secret level, the entire civilian AI playbook becomes irrelevant.


UCMJ Proceedings and the AI Landscape

Courts-martial under the Uniform Code of Military Justice operate through a distinct procedural framework that creates unique AI challenges. Article 27 guarantees the right to counsel, and the military appellate courts — the Army Court of Criminal Appeals, the Navy-Marine Corps Court of Criminal Appeals, the Air Force Court of Criminal Appeals, and the Court of Appeals for the Armed Forces — conduct mandatory review under Articles 62, 66, 69, and 73 of the UCMJ. None of these courts have issued formal AI standing orders analogous to those proliferating in federal district courts. The Joint Service Committee on Military Justice, which oversees UCMJ procedural rules, hasn't addressed AI use in court-martial proceedings directly. This regulatory gap means JAG officers are making individual decisions about AI use without formal guidance — some are using commercial AI tools for legal research while others avoid them entirely based on security concerns.

Classification Constraints: The Showstopper for Commercial AI

The fundamental barrier to AI adoption in military courts is information classification. Many courts-martial turn on facts that are classified, designated controlled unclassified information (CUI), or subject to operational security restrictions. You can't input classified witness statements, intelligence assessments, or operational details into ChatGPT, Claude, or any other commercial AI platform — full stop. Even unclassified military cases often involve service members' personally identifiable information, law enforcement sensitive material, or information protected under the Privacy Act. DoD's cybersecurity framework requires that any system processing military information meet specific security standards, and most commercial AI platforms lack the FedRAMP High authorization required for sensitive military data. None of this makes AI unusable in military legal practice — it means the tools available to JAG officers are limited to government-developed systems or commercial platforms approved for the specific classification level of the data involved.
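To make the constraint concrete, a minimal pre-submission gate might scan a draft for restricted markings before anything is sent to a commercial AI platform. This is a hypothetical sketch, not any DoD tool: the marking list is illustrative rather than an authoritative taxonomy, and a marking scan can never replace human review, since classified facts may appear with no marking at all.

```python
import re

# Hypothetical pre-submission gate: block text bearing classification or
# CUI markings from reaching a commercial AI API. The marking list is
# illustrative, not an authoritative DoD taxonomy.
BLOCKED_MARKINGS = re.compile(
    r"\b(TOP SECRET|SECRET|CONFIDENTIAL|CUI|FOUO|NOFORN)\b",
    re.IGNORECASE,
)

def safe_to_submit(text: str) -> bool:
    """Return True only if no restricted marking appears in the text."""
    return BLOCKED_MARKINGS.search(text) is None

# A marked witness statement is caught; plain unclassified research is not.
assert not safe_to_submit("Witness statement marked SECRET//NOFORN")
assert safe_to_submit("Research memo on Article 66 appellate review")
```

Note that the gate is deliberately over-inclusive: with re.IGNORECASE, even the ordinary lowercase word "secret" blocks submission, which is the right failure mode in this environment.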

The Bimodal Judge Advocate: DoD's AI Vision

The Army Lawyer article on the 'Bimodal Judge Advocate' outlines a vision where military lawyers operate in two modes: traditional legal analysis and AI-augmented workflows. This isn't theoretical — DoD has been investing in AI capabilities across all service branches, and the legal function is no exception. The concept envisions JAG officers using AI for legal research on unclassified matters, drafting administrative actions, analyzing court-martial data trends, and processing the enormous volume of military justice paperwork that consumes junior JAG officers' time. For managing partners at firms handling military defense cases (Article 6b victim representation, military commissions, or appeals to CAAF), understanding this evolution matters. Your opposing counsel — military prosecutors — will increasingly have access to AI tools optimized for military justice. Civilian defense counsel need equivalent capabilities, adapted for the classification constraints of their specific cases.

Military Commissions and National Security Cases

Military commissions — the tribunals handling cases against foreign combatants — represent the most extreme intersection of AI and security constraints. These proceedings involve classified evidence, special access programs, and national security information that cannot be processed through any commercial system. The Military Commissions Act and the Manual for Military Commissions establish procedural rules that predate the AI era entirely. There's no guidance on AI use in commission proceedings, but the classification barriers are so severe that practical AI adoption is effectively limited to unclassified legal research and publicly available precedent analysis. For civilian defense counsel appointed to military commission cases — typically experienced capital defense attorneys — the practical approach is to use AI only for unclassified research and writing tasks while maintaining strict information barriers between AI-assisted work product and classified case materials.

What JAG AI Policy Will Look Like

Based on DoD's broader AI strategy and the Army Lawyer article, military AI policy for legal practice will likely follow the department's existing risk-based framework. DoD Instruction 5400.19, which governs public affairs use of AI, provides a template: it requires human oversight, prohibits fully autonomous decision-making for consequential actions, and mandates documentation of AI use. Applied to military justice, this framework would require JAG officers to verify all AI-generated legal research, prohibit AI-drafted charging decisions or convening authority recommendations without substantive human review, and document AI use in case files. The timeline for formal JAG AI policy remains uncertain, but the cultural shift is already underway. Junior JAG officers entering service are digital natives who expect AI tools as part of their legal practice toolkit. The question isn't whether military courts will adopt AI — it's whether the policy framework will be in place before widespread unofficial adoption creates problems.
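If JAG policy adopts that same human-oversight-plus-documentation pattern, the case-file record of AI use might look something like the following sketch. Every field name here is invented for illustration; no DoD form or instruction prescribes this schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical case-file entry documenting AI assistance. The schema is
# invented for illustration; no existing DoD form prescribes these fields.
@dataclass
class AIUseRecord:
    case_id: str
    tool: str               # which AI system was used
    task: str               # e.g., "unclassified case-law survey"
    reviewed_by: str        # the human who verified the output
    verified: bool = False  # flipped only after substantive human review
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

record = AIUseRecord(
    case_id="ACM-2026-001",
    tool="government-approved research assistant",
    task="unclassified case-law survey",
    reviewed_by="Capt. Example",
)
record.verified = True  # only after the officer checks every citation
```

The design point is that verification defaults to False: the record cannot claim human review happened unless someone affirmatively marks it, mirroring the instruction's prohibition on autonomous consequential decisions.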

The Bottom Line: Military courts are the most constrained AI environment in the U.S. legal system — classification requirements, operational security, and the UCMJ's unique procedural framework create barriers that commercial AI platforms can't overcome. But the Army Lawyer's 'Bimodal Judge Advocate' article signals that DoD is preparing for AI-augmented military legal practice. Civilian defense counsel handling military cases should build AI capabilities for unclassified work while maintaining strict information barriers for classified materials.

AI-Assisted Research. This piece was researched and written with AI assistance, reviewed and edited by Manu Ayala. For deeper takes and the perspective behind the research, follow me on LinkedIn or email me directly.