In [15]:
!pip install -U pydantic langgraph langchain langchain-core langchain-community ctransformers
Requirement already satisfied: pydantic in c:\users\bb23u\appdata\local\anaconda3\lib\site-packages (2.12.3)
Requirement already satisfied: langgraph in c:\users\bb23u\appdata\local\anaconda3\lib\site-packages (1.0.1)
Requirement already satisfied: langchain in c:\users\bb23u\appdata\local\anaconda3\lib\site-packages (1.0.1)
Requirement already satisfied: langchain-core in c:\users\bb23u\appdata\local\anaconda3\lib\site-packages (1.0.0)
Requirement already satisfied: langchain-community in c:\users\bb23u\appdata\local\anaconda3\lib\site-packages (0.4)
Requirement already satisfied: ctransformers in c:\users\bb23u\appdata\local\anaconda3\lib\site-packages (0.2.27)
In [2]:
from langchain_community.llms import CTransformers
In [3]:
from langchain_core.callbacks import AsyncCallbackManager
from langchain_core.callbacks.streaming_stdout import StreamingStdOutCallbackHandler
from langchain_community.llms import CTransformers

# Set up callback manager
callback_manager = AsyncCallbackManager([StreamingStdOutCallbackHandler()])
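Since every call below uses the blocking invoke() API, a plain synchronous CallbackManager would serve the same purpose here; a minimal sketch (an assumption, not required by the rest of the notebook):

from langchain_core.callbacks import CallbackManager

# Alternative (assumption): a synchronous manager is sufficient for blocking invoke() calls.
callback_manager = CallbackManager([StreamingStdOutCallbackHandler()])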
In [43]:
!pip install huggingface_hub[hf_xet]
from huggingface_hub import hf_hub_download
import os

# Set your target directory
local_dir = r"D:\Open LLMs"

# Create the folder if it doesn't exist
os.makedirs(local_dir, exist_ok=True)

# Hugging Face repository and file name
repo_id = "TheBloke/Llama-2-7B-Chat-GGUF"
filename = "llama-2-7b-chat.Q3_K_S.gguf"

# Download the file to the specified local directory
local_path = hf_hub_download(
    repo_id=repo_id,
    filename=filename,
    repo_type="model",
    local_dir=local_dir
)

print(f"✅ File downloaded successfully!\nSaved at: {local_path}")
Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`
Requirement already satisfied: huggingface_hub[hf_xet] in c:\users\bb23u\appdata\local\anaconda3\lib\site-packages (0.34.3)
Collecting hf-xet<2.0.0,>=1.1.2 (from huggingface_hub[hf_xet])
  Downloading hf_xet-1.1.10-cp37-abi3-win_amd64.whl.metadata (4.7 kB)
Downloading hf_xet-1.1.10-cp37-abi3-win_amd64.whl (2.8 MB)
Installing collected packages: hf-xet
Successfully installed hf-xet-1.1.10
llama-2-7b-chat.Q3_K_M.gguf:   0%|          | 0.00/3.30G [00:00<?, ?B/s]
✅ File downloaded successfully!
Saved at: D:\Open LLMs\llama-2-7b-chat.Q3_K_M.gguf
In [4]:
# Model generation settings
config = {
    'max_new_tokens': 2048,     # upper bound on tokens generated per call
    'repetition_penalty': 1.1,  # >1.0 discourages repeating the same phrases
    'temperature': 0.7,         # sampling randomness
    'top_k': 50                 # sample only from the 50 most likely tokens
}

# Load local model
llm = CTransformers(
    model=r"D:\Open LLMs\llama-2-7b-chat.Q3_K_S.gguf",
    model_type="llama",
    config=config,
    callback_manager=callback_manager,
    verbose=True
)
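If generation gets cut off on longer notes, ctransformers also accepts a context_length setting (and gpu_layers for offloading to a GPU build); a sketch, assuming the rest of the config above is kept:

# Sketch (assumption): widen the context window; gpu_layers=0 keeps inference on CPU.
llm = CTransformers(
    model=r"D:\Open LLMs\llama-2-7b-chat.Q3_K_M.gguf",
    model_type="llama",
    config={**config, "context_length": 4096, "gpu_layers": 0},
    callback_manager=callback_manager,
    verbose=True
)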
In [6]:
#Example 1
prompt = """
Highlight the most clinically important concepts in the following *Assessment and Plan* section by wrapping them in HTML <span> tags. 
Focus on diagnoses, medications, procedures, and follow-up plans that are essential for clinical decision-making.

Input text:
Assessment and Plan: 
Patient with known Coronary Artery Disease s/p CABG, presenting for post-op follow-up. 
Blood pressure remains elevated despite current beta blocker and diuretic therapy. 
Continue aspirin and statin. 
Initiate ACE inhibitor for improved BP control. 
Encourage dietary modifications and daily walking. 
Schedule repeat lipid panel in 6 weeks and cardiology follow-up in 2 months.
"""
response = llm.invoke(prompt)
print("\n\nResponse:\n", response)

Response:
 
Output:
<span>Diagnoses:</span> Coronary Artery Disease, Hypertension

<span>Medications:</span> Aspirin, Statin, Beta Blocker, Diuretic, ACE Inhibitor

<span>Procedures:</span> CABG (past)

<span>Follow-up plans:</span> Repeat lipid panel in 6 weeks, Cardiology follow-up in 2 months

Note: The output is a summary of the most clinically important concepts in the input text.
In [8]:
#Example 2
prompt = """
You are reviewing a physician's clinical note. 
Provide a concise summary (2–3 sentences) describing the patient’s condition, key findings, treatment, and plan.

Input text:
Progress Note:
Mrs. Adams, a 72-year-old female with a history of chronic obstructive pulmonary disease (COPD) and hypertension, 
presented with worsening cough and dyspnea. 
O₂ saturation on arrival was 88%. 
Chest X-ray revealed bilateral infiltrates consistent with pneumonia. 
Started on IV antibiotics (ceftriaxone, azithromycin) and supplemental oxygen via nasal cannula. 
Continue home antihypertensives. 
Plan to reassess respiratory status and transition to oral antibiotics once stable. 
Follow-up chest imaging in 2 weeks.
Please write a summary
"""

response = llm.invoke(prompt)
print("\n\nResponse:\n", response)

Response:
 Summary: Mrs. Adams, a 72-year-old female with COPD and hypertension, presented with worsening cough and dyspnea. Her oxygen saturation was 88% on arrival. A chest X-ray revealed bilateral infiltrates consistent with pneumonia, and she was started on IV antibiotics and supplemental oxygen via nasal cannula. She will continue to take her home antihypertensives and follow-up imaging is planned in 2 weeks.
In [10]:
#Iteration over a JSON file
import json
# Load JSON notes
with open("D:\Open LLMs\clinical_notes.json", "r") as f:
    notes = json.load(f)
notes[0]
Out[10]:
{'patient_id': 'P001',
 'note': 'Mr. Thomas, a 68-year-old male with diabetes and hypertension, presented with dizziness and elevated blood sugar. Started on insulin glargine 10 units nightly and advised low-carb diet. Scheduled endocrinology follow-up in 1 month.'}
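Each entry is just a patient_id plus a free-text note. If the original clinical_notes.json is not available, a file with the same shape can be created first (the placeholder notes below are illustrative, not the original data):

# Hypothetical setup: write a small clinical_notes.json with the structure shown above.
sample_notes = [
    {"patient_id": "P001", "note": "<de-identified clinical note text>"},
    {"patient_id": "P002", "note": "<de-identified clinical note text>"},
    {"patient_id": "P003", "note": "<de-identified clinical note text>"},
]
with open(r"D:\Open LLMs\clinical_notes.json", "w") as f:
    json.dump(sample_notes, f, indent=2)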
In [11]:
# Define prompt template
prompt_template = """
You are reviewing a physician's clinical note.
Provide a concise summary (2–3 sentences) describing the patient’s condition, key findings, treatment, and plan.

Input text:
{note}
"""

# Iterate through all notes
results = []
for entry in notes:
    patient_id = entry["patient_id"]
    note_text = entry["note"]
    prompt = prompt_template.format(note=note_text)

    print(f"\n🩺 Processing Patient {patient_id}...\n")
    response = llm.invoke(prompt)

    results.append({
        "patient_id": patient_id,
        "input_note": note_text,
        "model_output": response
    })

# Save results to a new JSON file
with open("D:\Open LLMs\clinical_results.json", "w") as f:
    json.dump(results, f, indent=2)

print("\n✅ Processing complete! Results saved to 'clinical_results.json'.")
🩺 Processing Patient P001...


🩺 Processing Patient P002...


🩺 Processing Patient P003...


✅ Processing complete! Results saved to 'clinical_results.json'.
In [12]:
results[0]
Out[12]:
{'patient_id': 'P001',
 'input_note': 'Mr. Thomas, a 68-year-old male with diabetes and hypertension, presented with dizziness and elevated blood sugar. Started on insulin glargine 10 units nightly and advised low-carb diet. Scheduled endocrinology follow-up in 1 month.',
 'model_output': '\nYour response:\nMr. Thomas is a 68-year-old male with a history of diabetes and hypertension who presented to the clinic with symptoms of dizziness and elevated blood sugar. His clinical note indicates that he was started on insulin glargine 10 units nightly and advised to follow a low-carb diet. He is scheduled for an endocrinology follow-up appointment in one month to monitor his progress.'}
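The raw completion sometimes carries filler such as the leading "Your response:" seen above; a small post-processing step, sketched here with an assumed clean_output helper, can strip it before the results are reused:

import re

def clean_output(text: str) -> str:
    """Remove leading filler like 'Your response:' or 'Summary:' and stray whitespace."""
    text = text.strip()
    return re.sub(r"^(Your response:|Summary:|Output:)\s*", "", text, flags=re.IGNORECASE).strip()

cleaned_results = [{**r, "model_output": clean_output(r["model_output"])} for r in results]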
In [16]:
from langgraph.graph import StateGraph, START, END
from langchain_community.llms import CTransformers
from langchain_core.messages import HumanMessage
import json
In [27]:
# ----------------------------
# Step 1: Define your model
# ----------------------------
# Note: this cell points at a different local model (Orca Mini v2 13B, GGML format);
# the Llama-2 GGUF downloaded earlier can be substituted here the same way.
llm = CTransformers(
    model=r"D:\Open LLMs\orca_mini_v2_13b.ggmlv3.q2_K.bin",
    model_type="llama",
    config={"max_new_tokens": 512, "temperature": 0.7},
    callback_manager=AsyncCallbackManager([StreamingStdOutCallbackHandler()]),
    verbose=False
)

prompt_template = """
Please do two things:
1. Wrap important medical entities (diagnoses, findings, medications, and follow-up plans) in HTML <span> tags.
2. Provide a concise summary (2–3 sentences) of the note.

Input text:
{note}
"""
In [28]:
# ----------------------------
# Step 2: Define state structure
# ----------------------------
from typing import TypedDict

class ClinicalState(TypedDict, total=False):
    patient_id: str
    note: str
    prompt: str
    output: str
In [29]:
# ----------------------------
# Step 3: Define graph nodes
# ----------------------------
def read_note(state: ClinicalState):
    """Create prompt text from the note."""
    formatted_prompt = prompt_template.format(note=state["note"])
    new_state = {**state, "prompt": formatted_prompt}
    return new_state

def run_llm(state: ClinicalState):
    """Run LLM on the prompt."""
    prompt = state.get("prompt", None)
    if not prompt:
        raise ValueError("Missing 'prompt' in state. Ensure read_note() ran first.")
    response = llm.invoke(prompt)
    return {**state, "output": response}

def show_output(state: ClinicalState):
    """Display the output neatly."""
    print(f"\n🩺 Patient {state['patient_id']} Result:")
    print(state["output"])
    return state
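Returning the full state from each node works, but LangGraph also merges partial updates into the existing state, so the first two nodes could equivalently return only the keys they change; a minimal sketch:

# Equivalent node definitions (partial dicts are merged into ClinicalState by the graph).
def read_note(state: ClinicalState):
    return {"prompt": prompt_template.format(note=state["note"])}

def run_llm(state: ClinicalState):
    return {"output": llm.invoke(state["prompt"])}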
In [41]:
# ----------------------------
# Step 4: Build LangGraph
# ----------------------------
graph = StateGraph(ClinicalState)

graph.add_node("read_note", read_note)
graph.add_node("run_llm", run_llm)
graph.add_node("show_output", show_output)

graph.add_edge(START, "read_note")
graph.add_edge("read_note", "run_llm")
graph.add_edge("run_llm", "show_output")
graph.add_edge("show_output", END)

app = graph.compile()
print(app.get_graph().draw_mermaid())
---
config:
  flowchart:
    curve: linear
---
graph TD;
	__start__([<p>__start__</p>]):::first
	read_note(read_note)
	run_llm(run_llm)
	show_output(show_output)
	__end__([<p>__end__</p>]):::last
	__start__ --> read_note;
	read_note --> run_llm;
	run_llm --> show_output;
	show_output --> __end__;
	classDef default fill:#f2f0ff,line-height:1.2
	classDef first fill-opacity:0
	classDef last fill:#bfb6fc
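The Mermaid source above can also be rendered inline in the notebook; draw_mermaid_png() uses the public mermaid.ink service by default, so it needs network access (a sketch, assuming IPython display is available):

from IPython.display import Image, display

# Render the compiled graph as a PNG instead of raw Mermaid text.
display(Image(app.get_graph().draw_mermaid_png()))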

In [31]:
with open("D:\Open LLMs\clinical_notes.json", "r") as f:
    notes = json.load(f)
notes[0]
Out[31]:
{'patient_id': 'P001',
 'note': 'Mr. Thomas, a 68-year-old male with diabetes and hypertension, presented with dizziness and elevated blood sugar. Started on insulin glargine 10 units nightly and advised low-carb diet. Scheduled endocrinology follow-up in 1 month.'}
In [32]:
for entry in notes:
    initial_state = {"patient_id": entry["patient_id"], "note": entry["note"], "output": ""}
    app.invoke(initial_state)
🩺 Patient P001 Result:

Output:
<span>Mr. Thomas, a 68-year-old male with diabetes and hypertension, presented with dizziness and elevated blood sugar.</span> <span>Started on insulin glargine 10 units nightly and advised low-carb diet.</span> <span>Scheduled endocrinology follow-up in 1 month.</span>

🩺 Patient P002 Result:

Output:
Ms. Rivera, a 45-year-old female with asthma, was experiencing increased wheezing and cough. She received nebulized albuterol and oral prednisone from her physician. The plan is to taper steroids and continue taking an inhaled corticosteroid. Follow-up with the pulmonology clinic in two weeks.

🩺 Patient P003 Result:

Output:
<span class="important">Mr. Patel, a 59-year-old male post coronary stent placement, presented for routine follow-up. His blood pressure is well-controlled with aspirin and atorvastatin. The provider encouraged cardiac rehab and lifestyle modification.</span>


To summarize, the note describes a follow-up visit for a 59-year-old male who had a coronary stent placement. The patient is taking aspirin and atorvastatin and has well-controlled blood pressure. The provider encourages cardiac rehab and lifestyle modification.
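Since app.invoke() returns the final state, the LangGraph loop can persist its results the same way the plain-LLM loop did earlier; a sketch (clinical_graph_results.json is an assumed output name):

graph_results = []
for entry in notes:
    final_state = app.invoke({"patient_id": entry["patient_id"], "note": entry["note"]})
    graph_results.append({
        "patient_id": final_state["patient_id"],
        "input_note": final_state["note"],
        "model_output": final_state["output"],
    })

with open(r"D:\Open LLMs\clinical_graph_results.json", "w") as f:
    json.dump(graph_results, f, indent=2)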
In [ ]: