Episode 60

Using AI Without Losing Discernment

Show Notes

In this episode, Camille McDaniel discusses the growing influence of artificial intelligence (AI) in mental health care, particularly for Christian mental health professionals. She emphasizes the importance of discernment and spiritual wisdom when integrating AI tools into practice. The conversation covers the potential benefits of AI, practical applications for counselors, and the ethical considerations that must be taken into account. Camille provides actionable steps for using AI effectively while maintaining a faith-based framework.

Podcast Episode Transcript

Camille McDaniel (00:01.544)
Welcome back. Welcome back to another episode of the Christ in Private Practice podcast. I am so glad to be talking to you today. I'm really excited about this topic. I always kind of geek out about certain things like this, but we are going to be stepping into a conversation today that is actually becoming more and more prevalent as a topic of conversation. And it actually is a little bit more urgent. We need to give it some

some thought, we need to pay attention to it. It is here. And so we’re gonna be talking about it because it impacts you as Christian mental health professionals. There’s no question at all that artificial intelligence is shaping mental health care and shaping the way that we work. And so as therapists who some of us are using AI tools, I mean, they really are built into a lot of what we are using today. For example, many of us,

have electronic health records that we are using in order to keep track of our client appointments and our billing. And it's using artificial intelligence to sometimes help us with treatment planning and developing different treatment plans. Sometimes artificial intelligence is embedded into an EHR to help with notes. And so there are many other tools. There are some tools that we use in order to see our clients virtually. And they have

artificial intelligence kind of integrated. I know that I use Zoom, the one with obviously the HIPAA compliant version of it, but I use Zoom and it absolutely has artificial intelligence kind of woven all into it now, some of the newer things that I’ve seen on the platform. And there are many other ways that it’s being used today by mental health professionals and also by clients. So, you know,

Therapists are just using it for all kinds of things, sometimes to summarize research and we’re streamlining paperwork and all kinds of things to support our clients. But you know, here’s the rub. Here’s the rub. AI is very powerful. It’s in a lot of places integrated into a lot of the things that we are using today, even our emails, right? And it’s very efficient.

Camille McDaniel (02:25.206)
I mean, really and truly, you all know, no matter what you have used AI for, you know how efficient it is, you know how fast it can be. And so if we’re not careful, we can unintentionally, as we are using it, and I think this is what causes people concern, and even sometimes why people kind of step away from it, but today, we’re gonna be talking about it in ways that you won’t necessarily feel so uncomfortable if you’re not using it.

And if you are using it, it’ll just give you a stronger framework for the use. But getting back to what I was going to say before I interrupted myself, if we’re not careful, we can unintentionally replace discernment with automation. And we can replace wisdom with speed. And we can replace seeking the Lord with seeking the easiest available tool.

So today we’re going to be talking about how to use AI with discernment, how to build support tools for yourself and for your clients ethically, and how to remain grounded in wisdom. You know, the kind of wisdom that does not come from technology, but it comes from the Lord. And scripture reminds us, you know, in Proverbs two and six, for the Lord gives wisdom.

from his mouth comes knowledge and understanding. And we’re also told in James chapter one, verse five, if any of you lacks wisdom, let him ask the Lord and he will give to all liberally and without reproach and it will be given to you. So AI can be a tool.

AI can support the work that you’re doing and it can help you to execute assignments efficiently and quickly, but AI is not the source of the wisdom that leads to the assignment. AI is not the source of the assignment, right? So today we’re gonna talk about how to keep that order right. First God, then discernment.

Camille McDaniel (04:48.478)
wise stewardship of any tools that we have at our disposal, including AI. So then with that being said, let's jump right on in to the rise of AI in mental health care. It's here, right? You know, before we even talk about discernment, let's acknowledge the reality that AI is already impacting mental health care in a massive,

massive way. It really is. So one of the things that we are going to look at are some statistics about it. And so that way we can kind of have a little bit of a foundation for how AI really is entering this space. And then we’ll go from there. So the first thing that we want to look at is the fact that more people are using AI for mental health support than ever.

There was a 2024 study that was published in JAMA. I don't know if you say JAMA or J-A-M-A, but it stands for the Journal of the American Medical Association. So the Journal of the American Medical Association Network Open found that 13.1% of US adults, which is tens of millions of people,

have used generative AI for mental health advice, and many of them used it monthly or more often. And that came from an article titled, Prevalence of Using Generative Artificial Intelligence Chatbots for Mental Health Advice Among US Adults. And that article came out, again, that was 2024 from the…

Journal of the American Medical Association, otherwise known as JAMA. The next thing as it relates to AI in mental health care is that AI has potential benefits for early detection and monitoring. So again, a 2025 review

Camille McDaniel (07:08.986)
in BMC Psychiatry, which BMC stands for BioMed Central. And so in that journal, BMC Psychiatry, it reported that AI tools can help detect early signs of depression, anxiety, and suicidal ideation and intentions through pattern recognition in speech and behavior. And that article was titled

AI-Based Tools for Mental Health Assessment: A Systematic Review. And again, that came out in 2025. You can kind of look those two different ones up. But the next thing then that we want to know about AI in the mental health space is that therapists are torn about artificial intelligence.

Who would question that? We are, you know, we tread lightly. We tend to tread lightly. We want to be very careful about any new things that are entering our space. We want to make sure that we are ethical. And so it's understood. But this was actually kind of like a survey that was done by Alma. You know, Alma is actually one of the large companies

that provide mental health care. I believe they were one of the tech companies that entered the mental health space and now provide care, definitely with, I believe, AI integration within their systems. But a national survey by Alma found that 62% of therapists believe that AI could reinforce bias in mental health care. And many are concerned about safety and ethical use.

And that survey was titled, Therapists Share Their Views on AI. And again, that was in 2024, in case I didn't mention the year. The next thing, and the last thing for the statistics and this foundational framework that we are putting together right now, is that AI is not fully trusted for more serious concerns. So Loma Linda University Health

Camille McDaniel (09:35.008)
released an analysis highlighting that AI tools cannot replace licensed clinicians for trauma, crisis, or complex mental health issues because of risks in accuracy and lack of relational depth. So that's something to also consider when we find ourselves feeling a little unsure about how AI might encroach on our space,

whether or not we're still going to have potency in this space, and we will. There are limitations. AI can do a lot, but it can't do everything. And that article was titled, The Truth About Turning Chatbots Into Therapists. And that was a 2024 article, again, by Loma Linda University Health. So AI is here and it is growing fast. It definitely has benefits.

And it has dangers, but as Christian, you know, mental health professionals, we have an added layer. Where does discernment sit, you know, when technology becomes powerful? And how do we partner with AI without losing the spirit-led wisdom that guides our decisions? So let’s jump into that and talk about that, the difference between seeking God and using a tool. Again.

You may feel like, I know the difference between seeking God and using a tool, Camille, and I know you do. You know, you all are highly intelligent, highly accomplished professionals, but just bear with me here, especially in a day and time where you have individuals who are trying to interpret tongues by using ChatGPT. Let's just dive on in. Just bear with me, okay?

So before we even talk about actionable AI steps, because I am going to kind of give you some steps, some prompts that you can actually use right after you finish listening to this podcast. But first, let’s go ahead and lay another foundation. So there’s spiritual order that we want to use. And the order that we want to use when we are even talking about stepping into using AI is that

Camille McDaniel (11:57.066)
We want to recognize that first, God gives the assignment of what we need to be doing at certain seasons of our life. God gives us the wisdom. And then you are going to discern what next steps based on the assignment that you have been given and the wisdom that you’ve been given. And then you’re going to execute that with tools. And of the many tools at your disposal,

AI is one of the tools that you have at your disposal to use. The danger is when we flip that order and AI becomes our starting point instead of, you know, the Lord. Because God gives the direction, not technology. We don't want technology giving us the direction, telling us, you know, what to think, and then helping us execute as well. So that's all. We just want to make sure that the foundation

is right. So for example, you might pray and feel led to create some new type of resource for anxious teens. Well, you have prayed, you got the assignment that you need to be helping your clientele, who happen to be anxious, who happen to be teenagers. You know, maybe the Lord has you kind of going a little bit towards younger people so that you can help them in the beginning,

so that you don’t necessarily have to worry about undoing a lot of things in the end, you know, when they get to be much older. But either way, this is what you’ve been led to do. And now, after you have the assignment and the wisdom, AI can help you draft worksheets a lot faster for this population. It can’t tell you what God is asking you to create, but, you know, it can help you to create that thing

with a lot more speed and efficiency. So we are going to start out with using discernment. We’re gonna use conviction, allow ourselves to be convicted, allow ourselves to use spiritual discernment. We are going to pray, Lord, give me wisdom. And you receive direction and then AI is gonna help you to carry out the steps efficiently.

Camille McDaniel (14:21.794)
This is a good framework, a good grounded order for us to use so that we can step into using AI and we don’t have to worry that AI is here and is going to take over our minds in negative ways. We just have to have a strong framework that we’re going to use as we step into it. And so that’s where we can then start looking at helpful ways that Christian counselors can use AI today.

And so I’m going to go ahead and I’m going to give you some things to think about. And we’re going to talk about some prompts that you can throw right on into an AI program today. And there are many AI programs. So you have ChatGBT, which a lot of us are very familiar with ChatGBT. You have Google Gemini. A lot of people heard of Google Gemini. You have Grok. I have not used that one.

I believe that one was created by Elon Musk. If I'm wrong, somebody kind of put a little note in there for me in the comment section. But that's another one. And you have another one, not to get off track here or anything like that, but just for your own knowledge base: there is also an AI system called Perplexity. Perplexity is an AI answer engine that cites its sources, and it can be used for medical issues. It's

actually good. I have used it a couple of times when I noticed certain symptoms coming up for clients and I was trying to determine, you know, what’s the severity? Should I refer them on to their doctor? Was this something that I just thought was serious and it’s not? And that was only in cases where a client may not have actually spoken to a physician yet and I was trying to determine whether that was an appropriate referral.

Back on track, let's swing back over to what we're going to be talking about right now with helpful ways that Christian counselors can use AI. And the tools that you might be leaning more toward would be like ChatGPT, Google Gemini, maybe even Grok. So AI will streamline... you know, the first thing that we'll talk about now is that AI will streamline administrative work.

Camille McDaniel (16:48.854)
Now it’s not going to replace clinical judgment, obviously, that would be unethical, but it definitely can help you streamline some of those admin tasks. And here are some examples of that, like drafting intake, some of your intake paperwork or some of your intake frequently asked questions, frequently asked questions that you may want to put on your website or you may want to put within paperwork.

It can definitely help you with creating psychoeducational handouts for your clients. It can help you with formatting worksheets and summarizing very long articles, because you can actually upload PDFs; I know you can do that with ChatGPT, and I believe Google Gemini as well. And it can help summarize things for you. Obviously,

and I just have to throw that out there, you are going to need to know what these articles say, because artificial intelligence is not without error. I just heard, on the radio or maybe online, about some attorneys who, I guess, turned in a brief. Now look, don't get me saying this wrong, but whatever attorneys have to turn in as it relates to the case and what they're trying to accomplish,

I think it's called a brief. And so they turned in the brief, but they had done this with AI and there were a lot of mistakes, and they got into some serious trouble because of it. So they obviously did not fact-check. And that's one thing: whenever I am using AI and I need it to give me information, I make sure that it cites the source for me, and I make sure I can go back and find it. So.

You want to just make sure that if you are using it for long article summarization, that you actually have a way to check and know what in the world it’s saying so that you know if you’re getting a proper summary. But otherwise, it can be super beneficial for that. It can also help you organize your treatment plans. Like I said, some of you are already using electronic health records, EHRs.

Camille McDaniel (19:05.994)
that are giving you the option to formulate treatment plans for clients using AI. So you can have AI help you organize the treatment plan wording, and then you go through and you edit it to make it more appropriate. But it’s a lot faster for you to edit than for you to have to create it step by step. So here’s a prompt I’m going to give you that you can throw right on in to one of those AI programs that I just mentioned.

before, and you can do that today. For example, type this in: "Create a plain-language explanation of what cognitive behavioral therapy is for a Christian adult client experiencing anxiety. Keep it educational and neutral. I will edit for clinical accuracy." There you go. You can put that

right into any AI program right now and you will be able to find that it creates an explanation. And if you want to go further, you can with that. And if you want to replace the word anxiety with anything that you specialize in, then you can do that as well. The next thing, number two, is that you can use AI to help you develop tools and resources and then you add the discernment and the theology behind it.

So again, an example of that would be like you can have it create journal prompts for your clients or a coping skills list. I know that there was a workbook many years ago that I was using regarding self-esteem and there was a list for different ways to help yourself enjoy life and to build yourself up. It was a coping skills list.

It was a little frustrating because some of the things on the coping skills list were not appropriate for all ages. So it would mention certain things that it was like, this is not really appropriate for me to share with a child or a teen. This is not going to be appropriate as a Christian mental health professional for me to promote them doing this particular activity. I mean, it was a really extensive list, but there was just a handful of things or more, a little more than a handful of things.

Camille McDaniel (21:27.768)
that I would have to black out when I would copy the list for clients that I wanted to give it to. So in this way, you can make a list that's going to be aligned with your clientele, and it's gonna be aligned with the way that you wanna operate as a mental health professional. So a coping skills list, or you might even do a lifestyle habit tracker; you know, maybe there's something you want your client to track and there's not currently a tracker out there for that.

Additional things that you might be able to do and create: grounding scripts, helping your client to ground themselves. Sometimes we all... we have the breathing, you know. We definitely have scripts for that, for deep breathing, and we have ones for grounding ourselves with, you know, things you can see, things you can touch, things you can smell, you know, the five things. And maybe you find that you need something else. Maybe it doesn't necessarily work for your particular client and situation.

So you can then create your own grounding scripts and see how they work. And if they continue to be successful, there you go. You’ve now tested it out and you have a tool that you can use with many other people. So you can also do Christian integration worksheets. You have to review it, make sure that the scriptures, the theology behind it is correct, but it can definitely put that out there for you. So here’s a prompt.

another one that you can use today and see what it produces for you. So put this in there. Say: "Draft a one-page worksheet titled Finding Your Identity in Christ Rather Than Anxiety. Include journaling questions. Do not include scripture. I will add scripture afterward." Now,

if somebody wants to actually use this journal prompt and you want it to include scripture, and then what you're going to do is check the scripture in order to make sure that it is applied correctly, then you can also instruct it to back this with scripture using this version of the Bible. That way you can ensure that it's a solid reference, and you can go to that reference, that

Camille McDaniel (23:44.664)
version of the Bible, and make sure that it is being used correctly, that the context is correct for what you have it actually creating. You're always going to have to do that. I remember creating something for a group that I was running for women, and I had it giving me certain scriptures that I was going to then have to look back and check. But when I was looking back and checking for the context and just making sure everything was appropriate,

it gave me a scripture that was not even in the Bible. Absolutely made up. So I kind of typed back into the artificial intelligence platform I was using, which at the time was ChatGPT. And I was like, this is actually not a scripture in the Bible at all. And it wrote back something to the effect of, my bad, you're right. And I'm like, what in the world? Where did you pull that from? So you absolutely

need to make sure that you are checking that it is correct. Okay. So, the next one that we’re going to look at, and I’ll give you a prompt, is the fact that AI is good for professional development. Absolutely, because you can summarize new studies, you can draft outlines for trainings, you can turn transcripts of things into teaching points. So, for example,

this podcast, or if you have a vlog, like if you're vlogging a podcast, or if you happen to be a blogger, you know, you can take any of the transcripts from the podcast, from the video, from, you know, what you've already written in your blog post. You can put that into AI, and it can then take all of that material and turn it into teaching points for a workshop, for a training,

for a small manual that you wanna create if you want to go a little bit further on what you were talking about in any of those services or not services, but content rather, excuse me, that you created. Really, really cool. And this next one gets me super excited. You can even organize continuing education content that you have created. So you can, again,

Camille McDaniel (26:05.516)
You’ve already been given the wisdom. You’ve seen things that are going on. You know how you’re supposed to be delivering it to the population that you are wanting to serve. And you want to make sure that it is all right and in order and is going to give people everything that they need according to ethical codes and all of that. You can upload a lot of that into an AI program.

And it can help you to make sure that you are staying in line with the requirements of the continuing education body, and make sure that you are staying in line with the different ethical codes. And again, you'll have to double-check, because you would have to upload the ethical codes if you want AI to help you craft something, making sure you stay in line with those ethical codes. But yeah, it can definitely help you do so much so efficiently.

So here’s a prompt. Here’s something that you can go ahead and use. You type this one in, or tell it to provide a summary of the latest research on burnout among therapists from peer reviewed journals in the last three years. List the journal titles in the summary. There you go. All right. See what happens. See what comes up. Tweak it a little bit. Maybe you want research on something else, you know, but

either way, you have a lot of leeway, and AI can help you in that way to be very efficient. And so the next one, and the last one that I'll give you regarding prompts and ways that AI can help you, is that AI can help you to diversify your business ethically. So some examples are content ideas for your podcast, or some course outlines for different courses that you want to offer,

trainings that you want to offer, blog post ideas, different outlines for that. Maybe even, maybe you want to give people prompts that they can then use in their daily life for prayer or for joy or for anything else, you know, that you want to be able to pass on to people. It can help you.

Camille McDaniel (28:29.248)
with some of your marketing copy, like not necessarily formulating a whole marketing plan for you, but like some of the copy, what you’re writing down or what you’re going to be sharing with other people in community spaces, it can help you to make sure that you are staying within the lines of ethics and integrity before you actually put it out there. That’s really important.

You know, it can help you in a lot of ways. Sometimes we get a little comfortable. We might think that something’s a great idea. We might think something is something we need to absolutely post. But then based on our code of ethics and based on, you know, integrity of the profession, AI might advise you to watch out. And, you know, these are the codes that you may find yourself actually breaking if you do a certain thing. So.

Again, it can be kind of a little helpful, obviously, if maybe you're not able to reach a colleague immediately for consultation. So here's a prompt that you can use, and again, see what you get: "Give me 10 podcast episode titles for Christian mental health professionals during the holiday season, focusing on boundaries, burnout, and faith."

You can obviously also do that one if you're not podcasting, or if you're not writing a blog or not doing video; you can have it generate ideas for your target market instead. So it might be moms under stress. It might be high-conflict couples. It might be pastors and lay leaders. It could be anyone. Put that in the prompt and see what it has to offer.

The next thing we’re going to talk about after looking at all the wonderful things that can be done, we want to definitely not end this podcast without mentioning cautions. don’t want to go without mentioning the thing that we’re thinking about. It’s like, yeah, that sounds all wonderful. That sounds all great. But there are some things I’m not so sure about. Well, you know what? You are right to be a little bit unsure.

Camille McDaniel (30:49.09)
because there are some things we need to be careful about. AI is here to help, but we don't want AI to slip into a space where it's replacing discernment. So AI is helpful, but there are some real risks. And here are the top caution zones with regards to AI and us being discerning.

Number one, sometimes, if we're not careful, we can find ourselves replacing spiritual discernment with AI-generated answers, because AI is confident. You know, it's persuasive, it's fast, and sometimes it is wrong. AI doesn't know God. It knows, you know, the scripture as far as written words,

but it doesn’t always understand it in context, right? An AI doesn’t feel conviction or hear the Holy Spirit. It is just a technological tool, okay? It’s not infused with relationship with the Lord and it hasn’t submitted itself to the Lord and it’s not in…

hearing from and in relationship with the Holy Spirit? No. So we don’t want to ask AI, what should I say spiritually? What is God’s will for his people here? Like we don’t want to ask AI for spiritual guidance. And again, for those who might be like, well, yeah, I know that, right? Remember I said earlier in this episode,

that there was an account of someone trying to use AI to interpret tongues. So just knowing that sometimes we can slip into those areas as human beings, we just want to be aware that we are not going to AI for spiritual direction. We’re not. We’re not going to say, what might I need to use for this client as far as biblical direction?

Camille McDaniel (33:08.782)
you know, even for our clients. We want to get all of that from the Lord. Those answers, they have to come from scripture, they have to come from prayer, and they have to come from discernment. You know, the next thing that we want to be careful about is that, like we talked about a little earlier with the survey that Alma did in 2024, therapists

do have a serious concern that AI might just be reinforcing certain biases, and that fear is valid. We have to be careful. AI and some of these programs are a collection of people's thoughts. There is some very fact-based information that AI has.

But then there are some that are not based on fact, but just opinion. And so AI just gathers it all together. I think I had to, once upon a time, I was trying to look some things up, just get some information. And I had to be really clear with AI on where I wanted it to pull its sources, like peer-reviewed journals, medical journals, mental health journals.

because it was pulling from places I didn't want it to. And the only way you know if it's pulling from places you don't want it to pull from is you have to tell it to cite its sources. So if I ask for information, I say, make sure to cite your sources when you're giving me this, you know. And then over time, I just had it rule out certain places that were not... they weren't legit, you know. And it was oftentimes based on

some opinion or information someone else could just kind of upload into a system on their own. The third thing that we want to be careful about with regards to AI is incorrect or unsafe mental health guidance. Again, the analysis from Loma Linda University Health warned about AI advice. It can be somewhat

Camille McDaniel (35:29.006)
simplistic, but it also can be misleading, especially for issues of trauma or crisis situations. So again, we want to be careful, and this is something we can inform our clients about, because our clients are more and more turning to AI for help, especially when they can't get an appointment with their clinician right away. You know, maybe somebody saw you just yesterday,

And then something unexpected happened within the next 48 hours. And now they can’t get you because maybe it’s the weekend and you don’t work on the weekend. And they just want to know, OK, what do I do? And maybe they don’t have strong support systems and friends and family that they can turn to. There are a lot of clients who are like that at this season of their life.

their therapist happens to be one of the main supports that they have. They don’t have very strong or healthy family connections and maybe not a lot of friend connections. So we just have to be careful about that piece with regards to AI and trauma and crisis situations that require more relational information and background history and being able to synthesize all of that.

So we can definitely keep that in a kind of corner of our mind to inform even our clients. And then we also want to make sure that we are aware of privacy concerns. And I think that’s a lot of what mental health professionals are also concerned about in addition to a few other things. There was a review that kind of noted that client data

processed through AI tools can sometimes raise some privacy challenges. So we want to make sure that we're looking at studies, public health studies and, you know, even technology studies, just to see what's going on really with regards to privacy. Because when it comes to privacy concerns, well, that might be something where you don't use AI, you know, if you can't really determine whether or not

Camille McDaniel (37:48.27)
There’s the proper privacy structure in place to protect, you know, health information. We want to look at how this could sometimes de-skill the therapist. So literally, you know, you find yourself over time kind of not being as sharp in your clinical skills because when

as a therapist, you rely on templates and shortcuts too often, your core skills can weaken. And this is just how the mind works. Do you guys remember how everyone used to remember telephone numbers? Back when telephone numbers actually didn't require you to put in the area code, you just had a telephone number.

You didn’t have like, you know, whatever three digit area code that you came from, but you just had a regular old seven digit telephone number and you remembered everyone’s telephone number. was no cell phone to store your telephone number in. You remembered a lot of numbers. And now today, how many numbers do you remember? If you say you remember all of them, like, you know, your friends and your family and all.

You are in the minority. You are doing great. Excuse me. You’re doing great. Because people can’t remember anymore. If they lost their phone tomorrow, they might remember one number, two numbers, probably not five. So, you know, again, when we rely on technology to replace the skill that once was being held within our own mind,

and we don’t sharpen that skill while using technology or allowing technology to help us sharpen that skill, then our core skills can start to fade. so AI should actually like augment, know, not replace your clinical voice, your skill set, your relational attunement, you your ability to discern.

Camille McDaniel (40:07.99)
It should not replace that. Otherwise, you may notice that without AI, if the systems went down tomorrow, you would not be able to be as effective. So we want it to help us. You know, use it so it can teach you more, so you can retain more information. Almost like, you know, as you’re researching, you’re retaining all of this that it is collecting for you, not allowing it to replace your clinical voice.

And so we’re going to essentially make sure, as we kind of round out our time, that we’re looking at AI coming together with a faith-based framework. So it’s a faith-based framework that we want to use as it relates to AI. And we want to use discernment with that. And here’s just the simple structure

that we can all just use every time that we are reaching for AI for anything. The first thing is that we’re going to pray. Lord, give me wisdom. Remember, that’s from Proverbs 2:6 and James 1:5. You know, we’re going to seek wisdom, and the Lord gives wisdom liberally and freely. The second thing we’re going to do is we’re going to clarify the assignment. So what is God directing you to create? What’s the purpose? Who is it for?

And then the third thing is we’re going to use AI to support the assignment we’ve already been given. So never let AI guide your direction. You are going to get the direction and you’re going to tell AI what you need it to do for you. And then the fourth thing is we’re going to apply discernment. So we’re going to edit things. We’re going to look over things to make sure that this is the direction the Lord wants us to go in.

Yes, we may put the assignment into AI, but then AI might kind of take a twist and a turn and tell you to say things or do things, and you’re like, no, that’s not sitting right. That’s not where the Lord told me to go. All right, let’s do this instead. So, you know, you want to make sure you apply discernment so that you can adjust the content, make sure it’s theologically sound, make sure it’s ethically accurate.

Camille McDaniel (42:31.832)
You know, make sure that it aligns with your licensing board expectations. And then the fifth thing is that we’re going to review for safety. We’re going to review for bias and make sure that we are operating in integrity. So, you know, check to make sure that the content honors the Scripture, that it’s clinically sound, that it maintains confidentiality, but it should, because you shouldn’t be putting in, like, you know, people’s names and all of that

if you’re just, like, creating worksheets or creating trainings and things like that. Make sure that it aligns with best practices and reflects your voice, not someone else’s voice. So it’s coming from you, your thoughts, your direction. Make sure that it is respecting diverse client backgrounds. AI may not necessarily know all the nuances of that. So this framework that we’re talking about keeps AI in its place as a tool, not

the source, not the guidance, because God remains the source, God remains the guidance. And so then when we look at all of this from today’s episode, what we know is that AI is not the enemy. It’s also not the savior, right? It’s a tool. That’s what it is. It’s a tool like many other tools. And tools are only as wise as the hands that are using them.

So my encouragement to you today is really simple. Use AI with discernment. Seek the Lord first and let wisdom lead the work that you’re doing. And let technology serve the assignment, to allow you to do it efficiently. We don’t always have as much time. And even with the time we have, we don’t necessarily want to use all of it

just working in our business, because we also have been given families, and we also have been given friends and so on. So if you create something with AI this week, it would be really cool if you shared that in the Christ in Private Practice Facebook group. I would love to hear how you’re using it. You know, tell us what you’re using or what you decided to do, and how you edited it, and what you learned from it, and, you know, who it’s helping.

Camille McDaniel (44:53.602)
Yeah, absolutely. If you want me to create additional AI prompts for Christian counselors, just let me know. Yeah, let me know. I mean, I can create some prompts that are tailored to the practice side, the faith side, or even client support tools. So yeah, absolutely. Thank you for joining me again. It’s always wonderful to be able to bring these thoughts, this information to you.

And I pray that in all that you do, the Lord just continues to guide you with wisdom and with clarity and with peace. Until next time, God bless.