
By Nate Raymond
Jan 15 (Reuters) - A federal judicial panel's proposal to regulate the introduction of artificial intelligence-generated evidence at trial received a lukewarm reception on Thursday from corporate lawyers and class-action attorneys, who called it a well-intended but premature attempt to address an evolving technology.
The U.S. Judicial Conference's Advisory Committee on Evidence Rules received that feedback at a hearing to consider a proposed rule, designed to ensure evidence produced by generative AI technology meets the same reliability standards as evidence from a human expert witness.
The proposal, if finalized, would be the first nationwide rule adopted by the federal judiciary to police the use of evidence generated by AI. The committee has been exploring since 2024 whether to adopt such a rule.
Several lawyers on Thursday said that while they appreciated the judiciary's effort to get ahead of the issue, the draft rule addressed a problem that may not yet exist and should be either reworked or scrapped.
"I really don't think that there is a need, a pressing need at this time, to go forward with your proposal," said Thomas Allman, the former general counsel of BASF Corp. "In addition, I'm really quite impressed with how well the courts seem to be doing without having any specific rules on this topic."
The reliability of the testimony of expert witnesses who rely on such technology is already subject to scrutiny under the Federal Rules of Evidence. But the rules do not currently cover what would happen if a non-expert witness used an AI program to generate evidence without any knowledge of its reliability.
Under the proposal, AI and other machine-generated evidence offered at trial without an accompanying expert witness would be subjected to the same reliability standards as expert witnesses, who are governed by Rule 702 of the Federal Rules of Evidence. The rule would exempt "basic scientific instruments."
Jeannine Kenney, a lawyer with the class action law firm Hausfeld, said it was hard for her to conceive of a litigator in a civil case trying to introduce machine-generated evidence without an expert witness already subject to Rule 702.
"We always want a human to vouch for that evidence," she said.
Robert Levy, who serves as executive counsel at Exxon Mobil, raised concerns about "ambiguities" in the rule as drafted.
When asked by U.S. District Judge Jesse Furman, a Manhattan-based judge who chairs the panel, if he thought adopting a rule was "premature," Levy said he had "mixed feelings."
"The landscape of what artificial intelligence technology will do and how that will affect civil justice or criminal justice cases is still being developed," he said.