DS News

MortgagePoint November 2025

corporate behavior, based on a review of those documents. AI not only can review documents and provide alerts regarding certain types of transactions, but it can also curate those results even further, narrowing them to exclude superfluous or unwanted data, as may be the case when I wish to eliminate regular expenses and activity from the transactions being identified.

But with great power comes great responsibility. Or, more on the nose, with an "all-knowing" bot built on potentially biased data and logic comes the need for careful attorney oversight and a healthy dose of suspicion.

For instance, I could ask an AI bot to review the documents and alert me to certain types of transactions, a request that may produce a large number of results, so I could then ask that only certain amounts be flagged and that regular expenses and activity be excluded. These prompts could create a very specific list of transactions for one group of people while ignoring problematic monthly transactions for lower-income debtors, thereby unintentionally creating a bias in favor of higher earners.

In the abstract, the above scenario sounds a bit like word soup, so let's look at a more detailed example. Say a debtor, Alex, typically runs out of money before receiving his paycheck, so his roommate, Kyle, is responsible for buying the groceries, paying the bills, paying for meals, and even making Alex's car payment every month. Once Alex's paycheck arrives, he transfers the money owed to Kyle directly to Kyle's bank account. The transaction history is apparent from a simple review of the financial statements themselves, but, because this is a recurring trend, an AI bot is less likely to see anything wrong with the activity and therefore would not alert an attorney to the significant issue (a simplified sketch of this blind spot appears below). The oversight could ultimately lead to Alex, being a lower-income client, being put into a class that is much more likely to have his discharge challenged, or one that would require him to pay additional funds to the bankruptcy trustee.

AI systems can also assist attorneys in screening for eligibility for Chapter 13 versus Chapter 7 bankruptcy, but should we let them? Eligibility for Chapter 7 bankruptcy is determined using an analysis called the means test, which is, in essence, an income threshold with a large grey area for individual circumstances, such as family size, amount of secured debt, charitable contributions, and child support, that can raise the amount a person is permitted to earn and still qualify for Chapter 7. Performing a means test accurately can take time, and a filed means test will undergo significant scrutiny from creditors and various trustees, including Chapter 13 Trustees and the United States Trustee. To put it simply, this analysis cannot be wrong.

However, because AI knowledge is based on the historical data that has been provided to it, such as prior cases filed and its analysis of that data, AI can and frequently does draw incorrect conclusions based on irrelevant variables. AI may analyze data such as ZIP codes, historical income, debt profiles, and asset types, and while these information points are seemingly neutral, together they amount to an analysis of socioeconomic status, inherently leading to biased outcomes for debtors with lower incomes or those residing in economically disadvantaged neighborhoods.
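To make the Alex and Kyle scenario concrete, here is a minimal, hypothetical sketch in Python of the kind of filtering logic described above. The Transaction record, the flag_transactions() helper, its thresholds, and the sample amounts are all invented for illustration and do not reflect any particular review product.

from dataclasses import dataclass

@dataclass
class Transaction:
    description: str
    amount: float       # outgoing amount in dollars
    months_seen: int    # consecutive months this payee appears in the statements

def flag_transactions(transactions, min_amount=1000, recurring_months=3):
    # Flag large transfers, but skip anything that recurs monthly,
    # on the assumption that recurring activity is a "regular expense."
    flagged = []
    for t in transactions:
        if t.amount < min_amount:
            continue    # below the attorney's dollar threshold
        if t.months_seen >= recurring_months:
            continue    # treated as routine monthly activity and excluded
        flagged.append(t)
    return flagged

history = [
    Transaction("Transfer to roommate Kyle", 1800, months_seen=12),
    Transaction("One-time wire to a relative", 2500, months_seen=1),
]

for t in flag_transactions(history):
    print("FLAGGED:", t.description, t.amount)

Only the one-time wire is flagged. Alex's monthly repayment to Kyle is excluded as "regular activity," so the attorney is never alerted, even though the rule executes exactly as written. The bias lives in the assumption that recurring means routine, not in any malfunction of the code, which is precisely the kind of blind spot that calls for attorney oversight of both the prompts and the results.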
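The ZIP code point can likewise be illustrated with a small, hypothetical example. The score_chapter_13_risk() function, its weights, and the neighborhood income lookup below are invented for this sketch; the point is only that a seemingly neutral feature can act as a proxy for socioeconomic status.

# Two debtors with identical budgets, differing only in where they live.
NEIGHBORHOOD_MEDIAN_INCOME = {   # hypothetical lookup; placeholder ZIP codes
    "00001": 95000,              # higher-income area
    "00002": 32000,              # lower-income area
}

def score_chapter_13_risk(monthly_income, monthly_debt, zip_code):
    # A toy model: the ZIP-code term quietly encodes neighborhood wealth.
    debt_ratio = monthly_debt / monthly_income
    neighborhood_factor = 60000 / NEIGHBORHOOD_MEDIAN_INCOME[zip_code]
    return debt_ratio * neighborhood_factor

for zip_code in ("00001", "00002"):
    score = score_chapter_13_risk(monthly_income=4000, monthly_debt=2000, zip_code=zip_code)
    print(zip_code, round(score, 2))

Identical finances produce different scores: roughly 0.32 for the wealthier placeholder ZIP code and 0.94 for the poorer one. A tool that keys chapter recommendations or extra scrutiny off a score like this treats the two debtors differently for reasons that have nothing to do with their actual budgets.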
As referenced earlier in this article, AI tools can also be used by debtors, creditors, or trustees to predict success in a case. However, AI trained on historical filing data may inherit problematic patterns and conclusions that could be harmful if allowed to permeate every task the AI bot performs. AI models that look at ZIP codes, consider a debtor's self-employment status, and evaluate items such as income brackets, car ownership status, or utility arrears may unknowingly find a correlation with financial instability, thereby leading an AI bot to recommend the wrong chapter of bankruptcy. Additionally, if low-income debtors have historically had higher dismissal rates or lower
