QIBA CT Tumor Volume Technical Confirmation: Results and Resolutions

Comment only
Proposed new language
Site Checklist
Site Conformance
Mark: Currently not conforming to all of the requirements in each section. Most items are feasible, but some are not, as indicated below. If this is intended to be provided to a site to demonstrate conformance, it would be fairly challenging to know the answers.
As a physicist, I know the concepts and DICOM details, but for most sites without such a physicist or other knowledgeable staff it would be tough without assistance.
Code Words / Tally
Shall confirm all participating acquisition devices conform to this Profile. NFWN? These rows basically summarize the sections below.
Shall confirm all participating reconstruction software conforms to this Profile. NFWN?
Shall confirm all participating image analysis tools conform to this Profile. NFWN?
Shall confirm all participating radiologists conform to this Profile. NFWN?
Shall confirm all participating physicists conform to this Profile. YRN?
Shall confirm all participating technologists conform to this Profile. NFWN?
Tally of resolution markers: OK 33, Done 53, TODO 3, Discuss 0, Gap 0
Acquisition Device and Reconstruction Software Checklist
Product Validation (section 3.1). Realizing it would be useful to document the make/model/version of the acquisition device used. At many sites it will be a list.
Mark: GE750HD
Marthony: GE750HD
Add line to checklist to record conformance "subject" make/model/version. Defer parameter settings to Conformance Statement. Done
Shall be capable of storing protocols and performing scans with all the parameters set as specified in section 3.4.2 "Protocol Design Specification". YRYR Reworded to focus on "tech picking the stored protocol" rather than "performing" since that's in the next row already. Done
Shall prepare a protocol conformant with section 3.4.2 "Protocol Design Specification" and validate that protocol as described in section 3.4.2. YRNFW Mark: Hoping to use the TG233 MTF software. Have some ImageJ tools but haven't validated them, so not sure if we trust those.
Kevin: Between the line above and the reqs below for the physicist/radiologist, consider whether this row is redundant.
Shall validate that the protocol achieves an f50 value that is between 0.3 mm-1 and 0.75 mm-1. See section 4.1. Assessment Procedure: In-plane Spatial Resolution. YRNFWY Revise to between 0.3 mm-1 and 0.5 mm-1 to be able to drop the consistency requirement. Done
Shall validate that the protocol achieves:
· a standard deviation that is < 60HU. See 4.2. Assessment Procedure: Voxel Noise
Shall record in the DICOM image header the actual values for the tags listed in the DICOM Tag column in sections 3.4.2 "Protocol Design Specification".NFWYRMarthony: Image Comments/Patient Comments are not saved in header
Greg: Although it's theoretically possible that slight motion might elongate/compress a tumor, in the vast majority of cases, if there was motion, the impact will be clearly visible to the radiologist without annotations from the tech, so the Rad can disqualify if necessary at the QA stage.
Remove this requirement (on the tech and the scanner). Make sure there is a QA step to look for patient motion or other positioning issues. Done
Shall record actual timing and triggers in the image header by including the Contrast/Bolus Agent Sequence (0018,0012).NFNNNFMarthony: Do not do contrast enhance chest exams.
Mark: Triggers not recorded in DICOM header by scanner. Does include (all 0018): 1041, 1042, 1043, 1044,1046, 1047, 1049
Mark: use injection protocols paired with imaging protocol, but the trigger value and ROI placement might vary. Only total amount is recorded in patient record.
Jen: without automated recording this would be very hard for any site.
Dropped this requirement. The QA check to disqualify cases with inconsistent contrast enhancement between baseline and current timepoint prevents negative impact to repeatability. Done
Shall support recording in the image header (Image Comments (0020,4000) or Patient Comments (0010,4000)) information entered by the Technologist about the acquisition.NFWYRMarthony: Cannot find in header, waiting for physicist support to check if modality or PACS is dropping these tags. Result: Modality wasn't sending.
Kevin - will check with GE about support for these flags.
Mark: 0010,4000 isn’t part of DICOM header on syngo CT 2012B; Believe Image Comments IS supported so OK. Contrast volume (not timing) and other details can be recorded there.
Based on decision to remove the corresponding requirement in Image Acquisition, remove this requirement.

Avoid adding a PACS actor (required to keep tags) until we confirm there is a problem here
Shall be capable of performing reconstructions and producing images with all the parameters set as specified in 3.4.2 "Protocol Design Specification". YRNFW Mark: I think some of our protocols aren't set up to do the thin slices, so that would be changed, but the biggest thing is the high-pitch acquisitions. NC since FW and pitch will be discussed in future editions, possibly using a Slice Sensitivity Profile instead. Done
Shall record in the DICOM image header the actual values for the tags listed in the DICOM Tag column in section 3.4.2 "Protocol Design Specification" as well as the model-specific Reconstruction Software parameters utilized to achieve compliance.NFWYRMarthony - this one is likely a yes.
Kevin: Difficult to reference the original document.
Kevin: Add checklist instruction to log reason for non-conformance if No (e.g. specific tags that are missing).
Kevin: Duplicate the list of tags between 3.4.2 and the requirement.
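Kevin's suggestion to log which tags are missing could be supported by a small header check at the site. A minimal sketch, assuming a hypothetical subset of the section 3.4.2 tag list (the real list lives in the Profile table):

```python
# Hypothetical subset of the section 3.4.2 DICOM tag list; the real
# list lives in the Profile table.
REQUIRED_TAGS = {
    (0x0018, 0x0050): "Slice Thickness",
    (0x0018, 0x9311): "Spiral Pitch Factor",
    (0x0018, 0x1210): "Convolution Kernel",
}

def missing_tags(header):
    """Return the names of required tags absent from `header`, a dict
    mapping (group, element) -> value, so a non-conformance log can
    record exactly which tags were missing."""
    return [name for tag, name in REQUIRED_TAGS.items() if tag not in header]
```

A header containing only Slice Thickness would log Spiral Pitch Factor and Convolution Kernel as missing, which is the kind of detail Kevin proposes recording.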
Image Analysis Tool Checklist
Product Validation (section 3.1). Mark: Checklist not completed; our Rads don't typically measure volumes, so we don't have a specific tool we use at this time. We expect that to change later this year.
It's an involved procedure but looks feasible
Marthony: Tested the following three:
1. Syngo.via Client 3.0 03.00.0000.0013
2. Philips IntelliSpace Portal v6.0.3.12200
3. Aquarius iNtuition Edition ver.
Kevin: Ask in log which models failed if "NO". Done
Shall allow multiple tumors to be measured.YRNFWNCOK
Shall either correlate each measured tumor across time points or support the radiologist to unambiguously correlate them.YRNFWMarthony: Tool presents study pairs and user either segments a given tumor in both studies, or the baseline segmentation is presented and the user must find the matching tumor in the new study and segment it.NCOK
Shall be able to present the reader with both timepoints side-by-side for comparison when processing the second timepoint.YRNFWNCOK
Shall re-process the first time point if it was processed by a different Image Analysis Tool or Radiologist.YFWNFWHow do you determine what tool or radiologist did the first one? Or do you plan just to process both all the time?
The tool may be indicated in the radiologist report. If processed by the image analyst, the tool used would likely be indicated.
So you would ask people to access and review the prior report to see if the tool/version is listed, and if so and it was different, would the current rad/analyst switch to use that tool or would they reprocess?
Marthony: Might switch to the tool or reprocess.
Typically the SW version isn't recorded in the report.
Have to know you are using the same tool vendor (version is hard to track).
Reworded to be able to reprocess. Actually doing the reprocess is an Analysis time requirement.
Done. In a clinical trial the two measurements will be performed on the same software by the same operator. In a clinical practice setting, if the site cannot verify that the software version and the person who processed baseline were the same, the measurement will be performed by the same operator on the same software version at both timepoints.
Shall be validated to compute tumor volume with accuracy within 3% of the true volume.
See section 4.3 Assessment Procedure: Tumor Volume Computation.
NFWNFWRequirement was clear but some problems with assessment procedure.

See 4.3 down below.
Will recheck requirement as part of resolving procedure. Done
See section 4.3 Assessment Procedure: Tumor Volume Computation. NFWNFW Oops. Merge this row and the one above in the profile. Done
Shall be validated to achieve tumor volume change repeatability with:
· an overall repeatability coefficient of less than or equal to 16%.
· a small subgroup repeatability coefficient of less than 21%
· a large subgroup repeatability coefficient of less than 21%
See section 4.4. Assessment Procedure: Tumor Volume Change Repeatability.
NFWNFWRequirement was clear but some problems with assessment procedure.

See 4.4 down below.
After discussion with Nancy, no need for RCp in requirements and assessment procedure. Revise requirements to decimal form to match the metric of RC. Done
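For reference, a repeatability coefficient like the one required above can be estimated from test-retest volume measurement pairs. A minimal sketch using a common Bland-Altman-style wCV estimator with RC = 2.77 * wCV; the exact estimator is governed by the section 4.4 procedure:

```python
import math

def repeatability_coefficient(pairs):
    """Estimate the within-subject coefficient of variation (wCV) from
    test-retest pairs (y1, y2) and return RC = 2.77 * wCV (as a
    fraction; multiply by 100 for percent)."""
    # wCV^2 estimated as the mean over pairs of (d^2 / 2) / m^2,
    # where d = y1 - y2 and m is the pair mean.
    wcv_sq = sum(((y1 - y2) ** 2 / 2) / (((y1 + y2) / 2) ** 2)
                 for y1, y2 in pairs) / len(pairs)
    return 2.77 * math.sqrt(wcv_sq)
```

Identical replicates give RC = 0; a requirement like "RC less than or equal to 16%" then corresponds to a returned value of 0.16 or less in decimal form, matching the resolution above.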
Shall be validated to achieve:
· an overall tumor volume %bias of less than the Allowable Overall %Bias
· a tumor volume %bias for each shape subgroup (spherical, ovoid, lobulated) of less than the Allowable Shape Subgroup %Bias
· slope between 0.98 and 1.02

The Allowable Overall %Bias and the Allowable Shape Subgroup %Bias are taken from Table 3.1.2-2 based on the overall repeatability coefficient achieved by the Image Analysis Tool using the assessment procedure in section 4.4.

See section 4.5 Assessment Procedure: Tumor Volume Bias and Linearity.
NFWNFWMarthony: This can be done. Please provide tools to perform calculations and reporting standards

See 4.5 down below.
No change to requirement specification. Done
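The %bias and slope checks above amount to comparing measured volumes against known true (phantom) volumes. A hedged sketch, assuming simple per-tumor percent-error averaging and an ordinary least-squares slope; section 4.5 defines the authoritative procedure:

```python
def percent_bias(measured, true):
    """Overall %bias: mean of per-tumor (measured - true) / true, in percent."""
    return sum((m - t) / t for m, t in zip(measured, true)) / len(true) * 100.0

def ols_slope(measured, true):
    """Ordinary least-squares slope of measured volume vs true volume."""
    n = len(true)
    mx = sum(true) / n
    my = sum(measured) / n
    num = sum((x - mx) * (y - my) for x, y in zip(true, measured))
    den = sum((x - mx) ** 2 for x in true)
    return num / den

# Perfect agreement gives 0% bias and a slope of exactly 1.
assert percent_bias([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]) == 0.0
assert ols_slope([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]) == 1.0
```

These are the quantities checked against the Allowable %Bias values in Table 3.1.2-2 and the 0.98 to 1.02 slope window.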
Shall calculate and make available to the operator the 95% confidence interval for tumor volume change based on the equation:

Y1 and Y2 is the volume measured at timepoint 1 and 2,
wCV1 and wCV2 is the within-nodule coefficient of variation for Y1 and Y2 as taken from the following table,
D1 and D2 is the longest in-plane diameter of the volume at timepoint 1 and 2:

D1, D2:      10-34 mm   35-49 mm   50-100 mm
wCV1, wCV2:  0.141      0.103      0.085
YFWNFWMarthony: It can be done. An excel spreadsheet with formula would be useful. Tools for the calculation would need to be provided along with a standard reporting system.

Kevin: This is a requirement that a conformant reporting tool must calculate the CI and display it.
For Appendix A it looks like the volume change in cm3 is:
Tera +0.12 and CI=[ - 0.41 , 0.65]
Philips -0.15 and CI=[ - 0.77 , 0.47]
Syngo +0.09 and CI=[ - 0.60 , 0.78]
All consistent with no change.

Did the Image Analysis Tool provide the CI? If not then Conform should be No. Having the site use a spreadsheet is a practical workaround to get site conformance, but it does not mean the Tool is conformant.
Marthony: No, the tool does not provide the CI.

Nick: There may be issues with a tool displaying a confidence interval.
Kevin: The intention is that the tool lets the user configure a wCV, then the tool calculates the CI using that to save the user the trouble of opening a browser based calculator and transcribing the numbers to get the same result. This is not intended to be a claim of repeatability by the Analysis Tool alone.
Nick: That might be persuasive but FDA would need to think it through.
Replace the "shall" with a recommendation and a note explaining the potential FDA regulatory burden that might be imposed.

FDA needs to think about it. Nick will put that into the conversation.

QIBA CTVol can revisit in the next edition based on that outcome.
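For illustration, the CI calculation a conformant tool would perform can be sketched as below. The propagation-of-error half-width used here is an assumption consistent with the variables listed above (the profile's exact equation should be checked against section 3.9); the wCV lookup follows the diameter table.

```python
import math

# wCV lookup by longest in-plane diameter (mm), from the table above
def wcv_for_diameter(d_mm):
    if 10 <= d_mm < 35:
        return 0.141
    if 35 <= d_mm < 50:
        return 0.103
    if 50 <= d_mm <= 100:
        return 0.085
    raise ValueError("diameter outside the 10-100 mm range")

def volume_change_ci(y1, y2, d1, d2):
    """95% CI for the change Y2 - Y1, assuming the usual
    propagation-of-error half-width:
    1.96 * sqrt((wCV1 * Y1)**2 + (wCV2 * Y2)**2)."""
    delta = y2 - y1
    half = 1.96 * math.sqrt((wcv_for_diameter(d1) * y1) ** 2
                            + (wcv_for_diameter(d2) * y2) ** 2)
    return delta - half, delta + half
```

If the interval contains zero, the measured change is consistent with no change, as in the Appendix A examples quoted above.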
Shall record percentage volume change relative to baseline for each tumor. YFWNFW Kevin: Where does the analysis tool record the %vol change, CI and SW version?
Marthony: There is no %vol change, CI or SW version recorded. This could be done in the reporting page during dictation, but it's not done. Canned dictations can be changed at any time to match or suit the compliance standards of the profile. SW changes annually, and follow-ups are done annually. SW version reporting would be a challenge since versions are constantly changing. Please see the attached images where a segmentation tool is used to render both timepoints simultaneously for lesion comparison.
Assume that dictation will capture what is needed so the requirement on the tool can be dropped. May re-introduce it in future editions if radiologists decide that it would reduce errors or improve productivity if it did not depend on dictation/transcription. Done
Shall record the confidence interval of result for each change measurement. YFWNFW Ditto. Done
Shall record the image analysis tool version. YFWNFW Ditto. Done
Radiologist Checklist
Staff Qualification (section 3.2)
Shall, if operator interaction is required by the Image Analysis Tool to perform measurements, be validated to achieve tumor volume change repeatability with:
• an overall repeatability coefficient of less than or equal to 16%.
• a small subgroup repeatability coefficient of less than 21%
• a large subgroup repeatability coefficient of less than 21%

See 4.4. Assessment Procedure: Tumor Volume Change Repeatability.
NFWNNF Mark: The sheer number of cases and the rapid turnaround required makes volumetry not feasible to do routinely.
Thoracic Section Head: This is too much to be handled by the Rad with their workload.
Any kind of volumetry would not be done on a regular basis.
Jen – can we check with other sections (eg. Abdomen)
Mark – will check
Kevin: Just to be clear, I presume this means that operator interaction is required (at least for some cases) by your tools. If interaction is not required, then this is an easy Yes. We may need to tweak the wording to make that clear.
Marthony: yes it is required.
Kevin: How long did it take the radiologist to go through the test set?
Marthony: note that some of the images in the test set are attached and might take quite a few minutes (15?) for a single case.
Rudresh: Our test cases are not "easy" (which makes it a "good" test but takes time)
Greg: What should our "standard" of performance be?
Marthony: Do we want to test on phantoms instead?
Nancy: Phantom won't tell you about their "patient" performance which is harder/more variation. It really is a critical point to characterize their performance "in the real world". And this is a one time test.
Greg: Might be hard for rads without "academic time" to spend on this. There will be resistance.
Kevin: So we need to understand what we are asking for.
Nancy: Might consider reducing some of the really hard cases if they are time consuming?
Ehsan: If a case takes a long time (and might worsen the repeatability score), is that working for or against our goal of characterizing the "typical performance" or the "worst case performance"?
Nick: If they are profile "conformant" lesions, should we drop? Hopefully people would be willing to do it once.
Ritu: If variability, might tease out if it is case issue or operator issue.
Rudresh: Finding the margin (when there is pleural or mediastinal attachment) might be the key point.
Add wording to allow for this to be done by an Image Analyst if that is how it will be done clinically at this site. Done. Added note above table.
Protocol Design (section 3.4.2)
Shall prepare a protocol to meet the specifications in section 3.4 (protocol design). YRNFW Kevin: Just want to double check. This is marked as yes/routine, but some of the elements of protocol design were marked as Not Feasible or Feasible Will Not Do. The process of protocol prep to certain reqs and dissemination is Routine/Feasible. OK
Shall ensure technologists have been trained on the (Acquisition) requirements of this profile.NFWNFWKevin: Did you use the checklist for training, or would other materials be useful. Were there concepts that needed additional explanation?
Binsheng - may need to put all the non-protocol parameter items into the standard site procedure. It would be hard to teach them to have exceptions.
Ehsan: The checklists are likely a good enough tool.
So training is feasible. Need to be cautious of specific requirements that are exceptional.
Sites will likely incorporate requirements beyond the protocol parameters into their procedure docs as needed. Should we make that a profile requirement?
Shall set (Total Collimation Width) to Greater than or equal to 16mm.YRYROK
Shall set (IEC Pitch) to Less than 1.5. YRNFN Mark: It is conceivable that some protocols may be configured to scan using the Flash mode, which has a pitch greater than 1.5. This would probably only be for chest studies, but realistically some patients with tumors in the chest to be measured may need to be scanned with Flash mode to minimize breathing. A more reasonable metric to assess than pitch may be SSP (Slice Sensitivity Profile).
Mark – TG233 recommends and provides tools to check the SSP which assesses z-axis resolution. High pitch with dual detector should still be able to meet the requirements via SSP. The actual value for SSP target should maybe come from the TG233 people (Ehsan and Co)
Nick – be consistent with our slice thickness targets that were chosen related to our tumor sizes and repeatability targets.
Ehsan - Duke will prepare a proposal (SSP requirement and assessment procedure) for consideration in the next edition of the Profile.
Note also there might be a tradeoff of ease & familiarity of pitch vs SSP
NC. After committee discussion, will keep the 1.5 requirement, and when the SSP procedure is ready we can consider adding it to the profile and allowing additional protocols that are >1.5 but pass the SSP procedure. Done
Shall set (Nominal Tomographic Section Thickness) to Less than or equal to1.5mm.YRYROK
Shall achieve a table speed of at least 4cm per second, if table motion is necessary to cover the required anatomy.YRYROK
Shall prepare a (Reconstruction) protocol to meet the specifications in this table. YFWNFN Mark: See IEC Pitch discussion. NC. See IEC Pitch resolution. Will investigate for next edition. Done
Shall ensure technologists have been trained on the (Reconstruction) requirements of this profile.NFWNFWOK
Shall set (Reconstructed Image Thickness) to between 1.0mm and 2.5mm (inclusive).YNFNFWMarthony: 0.625 mm ST is standard at our institution
Nick: Simple geometries might be more accurate, but complex geometries (e.g. highly spiculated) could get worse.
Q. how many sites typically want to go below 1.0
Line 101 also proposed 0.5 to 2.0mm
Jen: 2.5 gets the 4 slice scanners, 2.0 for an upper limit does not. And new scanners can reconstruct wide, but old ones cannot reconstruct narrow.
Ehsan: Relax the lower bound. The noise floor catches the lower limits. And it's better for partial vol and don't want to do extra recons unnecessarily.
Rudresh: Consistency is a bit harder.

Shall set slice thickness between 0.5-2.5mm
Shall set ("spacing") to less than or equal to the Reconstructed Image Thickness (i.e. no gap, may have overlap).YRYRKevin: Do you use no gap or some overlap?
Marthony: Duke - no overlap, no gap
Subject Handling (section 3.5)
Shall prescribe a contrast protocol that achieves enhancement consistent with baseline.YFNYRMarthony: N/A typically we do not use contrast in standard chest exams.
Kevin: The profile also states it is applicable to abdominal lesions. Would these requirements as written be adequate for e.g. liver? Do you agree with having one profile that covers lung, liver, lymph, etc in one spec?
Marthony: We can consider changing the specification text (referring to contrast) to say: For protocols involving contrast, shall prescribe contrast protocol to achieve enhancement consistent with baseline. Some protocols do not require contrast agents.
Ehsan: Try to use generic language that is open for liver, lung, etc.
Revised text to clarify that no contrast is an acceptable contrast protocol. Done
Shall determine whether the selected (intravenous) contrast protocol, if any, will achieve sufficient tumor conspicuity.YFNYRMark - Visual assessment, but also with the triggering with values that hit fairly consistent effect.
Marthony: N/A typically we do not use contrast in standard chest exams.
Shall determine whether the selected (oral) contrast protocol, if any, will achieve sufficient tumor conspicuity.YFNYRMarthony: N/A typically we do not use contrast in standard chest exams.OK
Image QA (section 3.8)
Shall confirm the images containing the tumor are free from artifact due to patient motion.YRYRKevin: Do you usually do this during reporting, during measurement, or is measurement done during reporting?
Marthony: Yes.
Kevin: Wise guy. :-) Please choose A, B or C.
Marthony: Yes to A & B
Shall confirm the images containing the tumor are free from artifact due to dense objects, materials or anatomic positioning.YRYROK
Shall confirm that there are no clinical conditions affecting the measurability of the tumor.YRYFWOK
Shall confirm (now or during measurement) that tumor longest in-plane diameter is between 10 mm and 100 mm.
(For a spherical tumor this would roughly correspond to a volume between 0.5 cm3 and 524 cm3.)
NFWYFWMarthony: Please indicate why it is limited to this size range.
Kevin: Please refer to the Discussion part of section 3.8 of the profile and let us know if further clarification would be useful.
Marthony: It would be helpful to reference section 3.8 in the specification here.
Marthony: The RIDER test set doesn't really test that lower bound. Some are above the upper bound.
Andy: Upper bound is dominated by the attachment requirements. Typically 500mm that doesn't have issues should work fine, so maybe let's raise it.
Lower bound: the RIDER Dataset isn't the "definitive" dataset. The lower end is not well sampled. Adding data to be more representative is a goal. Let's keep 10mm since it is implicit in the thinking, but we should get more test data.
Greg: And we have another profile for below 10.
Jim: High end will correlate with lots of metastatic disease. Not as important to distinguish "size of cannonball" (not a clinical pivot). Agree to get more data.
Nick: Upper end brings more complex morphology and lack data so go with Jim's comment and get more data on lower end.
Will leave where it is for now.
Plan on more lower end data
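The parenthetical sphere-volume correspondence in the requirement above is just V = pi * d^3 / 6, which can be checked directly:

```python
import math

def sphere_volume_cm3(diameter_mm):
    """Volume of a sphere of the given diameter, in cm^3 (V = pi * d^3 / 6)."""
    d_cm = diameter_mm / 10.0
    return math.pi * d_cm ** 3 / 6.0
```

A 10 mm diameter gives about 0.52 cm3 and 100 mm about 524 cm3, matching the rough 0.5 to 524 cm3 range quoted for the 10 to 100 mm bounds.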
Shall confirm the tumor margins are sufficiently conspicuous and unattached to other structures of equal density to distinguish the volume of the tumor.NFWYFWOK
Shall confirm that the phase of enhancement and degree of enhancement of appropriate reference structures (vascular or tissue) are consistent with baseline.NFNYFWIs this basically done visually since contrast details of prior are not recorded in the image header?
Kevin: If both baseline and TP2 are non-contrast, that would conform to this requirement. Does that resolve the "not going to do"?
Marthony: Yes.
Added discussion text. Done
Shall disqualify any tumor they feel might reasonably degrade the consistency and accuracy of the measurement.
Conversely, if artifacts or attachments are present but the radiologist is confident and prepared to edit the contour to eliminate the impact, then the tumor need not be judged non-conformant to the Profile.
NFWYFWShall disqualify a tumor if it has features or characteristics on the current scan that would be expected to degrade the consistency or quality of the measurement.
Shall disqualify a tumor if comparison to the prior scan shows changes that would be expected to degrade the consistency or quality of the measurement.
(has features or characteristics on the current scan that)
Current text was found to be OK, but examples of images that should be disqualified would help.
TODO (someday): Get example images (Duke? MGH? Columbia? Brigham?). Note, this would be cases where the result is less reliable, not necessarily cases where quantification is abandoned completely.

A couple of the RIDER data might be "bad".
Andy and Jen reached out to Mike M-G and got a list in Dec '17 of which RIDER data ought to be disqualified for these reasons.
TODO. Removed them from the 4.4 procedure dataset; that brings us down to 14 small and 6 large lesions for a total of 20 (down from 31).

TODO get clarifications on disqualification reasons
Shall confirm that the tumor is similar in both timepoints in terms of all the above parameters.NFWY?Marthony: Protocol consistency can be confirmed. Consistent use of a single imaging system cannot be guaranteed especially in the case of referrals from other institutions
Kevin: None of the above parameters are particularly imaging system or protocol specific. Will need to clarify the requirement.
Remove the requirement. All the above parameters already mention consistency with baseline as appropriate. Done
Image Analysis (section 3.9)
Shall re-process the first time point if it was processed by a different Image Analysis Tool or Radiologist.YFWYFWOK
Shall review & approve margin contours produced by the tool.NFNYFWMarthony: This is not typically done at this institution. However, this may be requested in special cases.
Kevin: Good feedback. This requirement was put in as a bit of a QA check on the contours. Is it the approval part that is the obstacle? Or does the radiologist use the volume number without looking at the contour? Or are they involved in the contouring so the review was really done then?
Marthony: The radiologist is typically involved in contouring, and tumor marking.
Kevin: In that case they would effectively review/approve the contour when they accept it and take the measurement.
How does it work when an analyst does the contour? Does the rad take the number without looking at the contour? If so, who is responsible if the contour was bad?
Marthony: I would need to follow-up on this question
Ritu - Radiologist should always signoff even if it's done by an image analyst or a "good" auto-segmenter.
Eric - agree. Rad always has last say/approval.
Marthony - The rad will review the contour if they were not involved in originally generating it.
Physicist Checklist
Periodic QA (section 3.3). Marthony: Proposed text: "Protocol design should be done collaboratively between the physicist and the radiologist with the ultimate responsibility to the radiologist. Some technical specifications are system dependent and may require special attention from a physicist. All protocols should be validated by the physicist."
Kevin: Good text. I believe this was the committee intention. We can adopt a variant of this language into the profile and thus into the checklist.
Added this to the discussion section of the profile. Done
Shall perform relevant quality control procedures as recommended by the manufacturer.
Shall record the date/time of QC procedures for auditing.
Protocol Design (section 3.4.2)
Shall validate that the protocol achieves an f50 value that is between 0.3 mm-1 and 0.75 mm-1.
See section 4.1. Assessment Procedure: In-plane Spatial Resolution
YRNFWYMarthony: ACR phantom was scanned on the GE 750 HD scanner using the prescribed protocols. The following results were obtained: f50 -> 0.45 mm-1
Kevin: The 4.1 Procedure called for using the soft-tissue insert edge but it looks like the measurement might have been taken on the bone insert edge. Is that something we should change in the procedure?
Marthony: You are correct, but the MTF between the four contrast inserts changes only slightly.
Kevin: So should we modify the requirement to allow the assessor to choose any insert or should we keep the requirement (which in the real world would require re-computing with the other insert)?
Marthony: Well, ideally, the assessor could use all inserts for measurement (as seen in image provided). In this case, f50 for soft-tissue insert is 0.4 mm-1.
Revise to between 0.3 mm-1 and 0.5 mm-1 to be able to drop the consistency requirement.

Consensus from Mike/Ehsan/Kirsten/Marthony seemed to be that we should evaluate both a soft-tissue insert edge and an air "insert" edge against the 0.3 to 0.5 range.

Revised requirement and assessment procedure accordingly.
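f50 is the spatial frequency at which the MTF falls to 0.5. A small sketch of extracting it from sampled MTF data by linear interpolation (a hypothetical helper, not the TG233 software):

```python
def f50_from_mtf(freqs, mtf):
    """Frequency (mm^-1) at which the MTF first falls to 0.5, by linear
    interpolation between samples. `freqs` ascending; `mtf` decreasing
    from ~1.0. Hypothetical helper, not the TG233 tool."""
    for i in range(1, len(freqs)):
        if mtf[i] <= 0.5 <= mtf[i - 1]:
            frac = (mtf[i - 1] - 0.5) / (mtf[i - 1] - mtf[i])
            return freqs[i - 1] + frac * (freqs[i] - freqs[i - 1])
    raise ValueError("MTF does not cross 0.5 in the sampled range")
```

An f50 of 0.45 mm-1, as measured above, would fall inside both the original 0.3 to 0.75 range and the revised 0.3 to 0.5 range.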
Shall validate that the protocol achieves:
• a standard deviation that is < 60HU.
See section 4.2. Assessment Procedure: Voxel Noise
YRNFWYMarthony: standard deviation -> 35 HU
Kevin: Including the Appendix B seems very helpful. Would it be OK to make it a required part of a conformance record?
The procedure the physicists put together for Voxel Noise called for a circular ROI at the phantom center. Appendix B shows an off-center circular ROI and a centered square ROI so it wasn't clear which was used. Should we relax the requirement? I could maybe see a slight advantage to the circular ROI if there are radial characteristics to the noise but I don't know if it's worth dwelling on. The size looks right though.
Marthony: 1. I would recommend including appendix B in all reports.
2. The centered square ROI was used for noise measures. Noise ROI shape is not significant according to my understanding.
Kevin: Should we modify the requirement to allow any shape?
Marthony: An appropriately sized and located square or circular ROI would be sufficient.
Removed shape requirement, kept size.
A more formal record would be nice but it's a burden so we can encourage but leave as optional.

Add some text using standard terms (record) to flag what kind of things to put in the record.
Note: if you are right on the edge, the details are more meaningful.
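The voxel-noise check reduces to the standard deviation of HU values in an appropriately sized ROI. A sketch using a centered square ROI, consistent with the resolution above that ROI shape is not significant:

```python
import statistics

def roi_noise_sd(image, cx, cy, half):
    """Standard deviation of HU values inside a square ROI centered at
    (cx, cy) with side 2*half + 1; `image` is a 2D list of HU values.
    Sketch only; section 4.2 governs ROI size and placement."""
    vals = [image[y][x]
            for y in range(cy - half, cy + half + 1)
            for x in range(cx - half, cx + half + 1)]
    return statistics.pstdev(vals)

def passes_noise_requirement(image, cx, cy, half, limit_hu=60.0):
    """Check the profile's < 60 HU standard deviation requirement."""
    return roi_noise_sd(image, cx, cy, half) < limit_hu
```

A measured standard deviation of 35 HU, as reported above, passes the 60 HU limit comfortably.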
Technologist Checklist
Subject Handling (section 3.5)
Shall use the prescribed intravenous contrast parameters.?FNY?Marthony: N/A for Chest
Kevin: What would the answers to this and the following be for a Liver lesion study?
Marthony: Contrast might be used in the case of liver scans. It would certainly change the response here.
Revised text to clarify that no contrast is an acceptable contrast protocol. Done
Shall document the total volume of contrast administered, the concentration, the injection rate, and whether a saline flush was used. ?FNNNF Mark: No DICOM field available in our software version for documenting saline flush.
Kevin: Saline Flush will go in future Contrast object, but would need a CP to add a tag for this if wanted today.
Note: the header will likely contain the protocol value not the "actual".
This goes with earlier contrast discussion. How does it affect volumetry consistency?
Jen – concentration and total are typically recorded
How is this all captured in clinical trials?
Rudresh – assess DICOM contrast tags for presence and visually assess adequacy and consistency between timepoints and whether the overall image quality is acceptable.
Marthony: N/A for Chest
Contrast is present in arterial phase
Nick investigated low-contrast cases but did not investigate how variation between baseline and T2 might affect volumetry.
Jen: Have to provide it to have credibility for results in the liver. (Greg: Agree)
Certainly captured in the reports so not too hard to comply?
Jen: Does Duke insertion project address varying contrast enhancement? Marthony: not currently but conceptually doable
Jen: Delay is the MOST important.
Jen: Propose - total volume, agent, concentration. Don't require rate and saline, but recommend. And depend on Rad QA to catch variance.
(Will add "If used" text). See also 76 and 85
Updated the QA text to disqualify (Jen and Kavita and Rudresh)
Shall use the prescribed oral contrast parameters. ?FNYR Marthony: N/A for Chest. Revised text to clarify that no contrast is an acceptable contrast protocol. Done
Shall document the total volume of (oral) contrast administered and the type of contrast.?FNNNFMarthony: N/A for Chest
Mark: Again, not sure where this could be documented, similar to the other fields in the 0018,104* series.
Kevin: Is this a factor for any of the tumors that are targeted by this profile?
Greg: Lymphoma, mesenteric lymphadenopathy.
Jen: But bowel loops are going to be variable. But keep it in for "credibility". Would raise questions if absent.
Eric: could point to it as desirable and recognize the issue in the informative discussion but not make it a requirement.
See also line 74. Done. Move to discussion.
Shall position the subject consistent with baseline. If baseline positioning is unknown, position the subject Supine if possible, with devices such as positioning wedges placed as described in section 3.5.1. Y YR Kevin: What method do you use during the followup scan to determine baseline positioning? Should that be added to the profile?
Marthony: We just use a standard alignment procedure.
Kevin: Certainly works for local followup.
If the baseline scan was at another institution and based on looking at the images the positioning was different, would you follow the baseline, or your local standard procedure?
Marthony: We just scan them according to our local protocol independent of the original scan.
Q. So do we fail Duke if they scan Supine and the prior was prone? If not, then eliminate the requirement and adjust the performance claim accordingly; if yes, that would disqualify the measurement.
Done. Remove and catch with Rad QA.
Shall remove or position potential sources of artifacts (specifically including breast shields, metal-containing clothing, EKG leads and other metal equipment) such that they will not degrade the reconstructed CT volumes.
YFWYR
Marthony: We use the breast shield, but we found that it does not cause artifacts
Kevin: Good feedback. The requirement allows for positioning "such that it will not degrade the reconstructed CT volumes". Would any clarification/modification help?
Marthony: Yes it would help.
Add clarification. Done.
Shall adjust the table height for the mid-axillary plane to pass through the isocenter.
YRYR
OK
Shall position the patient such that the “sagittal laser line” lies along the sternum (e.g. from the suprasternal notch to the xiphoid process).
YRYR
OK
Shall instruct the subject in proper breath-hold and start image acquisition shortly after full inspiration, taking into account the lag time between full inspiration and diaphragmatic relaxation.
YRYR
Kevin: Can't really check this requirement. "Do your best", but we don't know if they didn't achieve it.
If they achieve it both times, consistency is achieved; if they miss it one time or the other, that's a disqualifier, so a consistency element is not needed.
Radiologist QA might check on this; we don't need a consistency requirement on the tech for this.
Targeting a motion-free lung.
It's good instruction even if we can't check it. No change. OK
Shall ensure that for each tumor the breath hold state is consistent with baseline.
NNFYR
Marthony: Not sure there is a consistent way to ensure/verify this
Kevin: Good point. Worth discussion in the committee.
Shift to a QA check for the radiologist. Done.
(Small Nodule has some carefully worded text from David Gierada, provided by James Mulshine. Now woven into the existing text.)
Shall record factors that adversely influence subject positioning or limit their ability to cooperate (e.g., breath hold, remaining motionless, agitation in subjects with decreased levels of consciousness, subjects with chronic pain syndromes, etc.).
NFWNFW
Kevin: Any problem finding a place in the scanner GUI to do this?
Marthony: I do not think this is a problem, but it is something that should be worked up in conjunction with the medical institutions. The actual factors are inconsequential.
Removed here since it's in Acquisition. Done
Shall ensure that the time-interval between the administration of intravenous contrast (or the detection of bolus arrival) and the start of the image acquisition is consistent with baseline (i.e. obtained in the same phase; arterial, venous, or delayed).
NFNNFW
Mark: Implementation of DICOM Protocol SOP class would make this much easier
Marthony: N/A
Propose dropping requirement. Rad will pick up these issues in QA with or without notes communicated from tech. (Add specifically to QA if it isn't there.) Done
Shall ensure that the time-interval between the administration of oral contrast and the start of the image acquisition is consistent with baseline. (Note that the tolerances for oral timing are larger than for intravenous.)
NFNNFN
Mark: Drink times for oral contrast aren’t recorded anywhere
Jen – bundle with the IV contrast discussion.
Likely also a subjective assessment by rad.
Marthony: N/A
Transit time is not consistent for a given patient, so the same timing still isn't really going to hit the same mark.
See 74. Done. Make informative.
Image Data Acquisition (section 3.6)
Shall select a protocol that has been previously prepared and validated for this purpose (See section 3.4.2 "Protocol Design Specification").
NFWYR
Kevin: How do you communicate to the techs which protocol(s) are the validated ones?
Marthony: Standard protocols are established. They are used based on the request of the overseeing physician.
Kevin: Does that mean you would validate all chest protocols against this profile?
Marthony: A specific chest/liver etc. protocol can be designed and validated based on this one.
Added text to highlight this in Discussion. Requirement is fine. Done
Shall report if any parameters are modified beyond the specifications in section 3.4.2 "Protocol Design Specification".
NFWNNF
Mark: Only one field is available for comments, and it is already used for comments on patient position or difficulties – no way to record in the header if AEC parameters change from baseline
Mark: New Performed Protocols would help but nowhere to put it now.
Jen – note this is only a problem in brain/head.

<The following comment was made at some point but I can't attribute a source we can check with. If anyone wants to re-raise this point they can. For now the decision is to remove this requirement.
"Require that they log if any change was made. Any change is considered to be non-conformant until someone goes and validates the variant protocol (see Protocol Design). Note also that if you see many modifications, that is a useful quality metric. Premise here is that techs seldom if ever modify the protocol to the particulars of the patient or the exam. ">
Really the question is whether they've changed the protocol enough to fail the noise and resolution tests. They likely won't adjust slice thickness, pitch, or table speed, so we can drop this requirement too. They have to use a validated protocol but don't have to record changes.
(Side note: the AEC parameters that the scanner uses to tweak the acquisition are in line with our assessment metrics, so that is a "safer" tweak.)
Done. TODO: Ehsan will prepare a note for the Protocol Design discussion and/or the Noise and Resolution Assessment Procedures highlighting the limitation (for future consideration) that our protocol assessment procedures should consider the range of patient body habitus.

Shall set (scan plane) Consistent with baseline.
YRNNF
Mark: If a brain study uses gantry tilt on some scanners and the follow-up is on a different scanner without gantry tilt, it is not possible to comply. Reconstruction of images in the same plane is routine, but scanning is not
Discussion: Likely might only happen with Head/neck. Physics Q: If it is different, does it affect quantitation?
Note that patient positioning differences in head/neck are probably bigger than the differences in gantry tilt. Primary neck tumors and head-neck carcinomas are the main place where this matters at all.
Change protocol requirement to No tilt? Or require Rad to confirm during QA whether tilt/anatomical slice orientation was consistent for baseline and followup.
Move to Rad QA and add informative text about what to look for and the nature of the impact. Informative text that 0-tilt is expected for most imaging except head/neck.
"any difference in tilt/orientation is unlikely to cause differences in quantification"
Shall set (kVp) Consistent with baseline (i.e. the same kVp setting if available, otherwise as similar as possible).
YRNFW
Kevin: How do you determine the baseline value? Does someone check before scan time and provide it to the tech in comments? Does the tech access the prior scan header to look it up?
Marthony: We use standard protocols. So expect kVp should be consistent.
Kevin: So there is only one protocol for Lung? Or if multiple, all parameters in 3.4.2 required to be consistent with baseline are the same between the protocols? That would work for local followups. How do/would you handle outside baselines?
Marthony: I will have to follow up on this.
Claudia - If prior scans are very poor, you might establish a new baseline.
Matt - in clinical environments there will be a lot of protocol variability, but the scanner might produce consistent (or inconsistent) results. kVp might vary ±20 safely for density. Volume should be insensitive to greater changes.
Kevin: Can we assume this is covered by the Noise/Resolution specs and drop kVp req? i.e. kVp variation can't degrade performance without impacting Noise/Resolution.
Jen: Based on Dr. Samei's input, we should keep this in.
Jen: for baseline at the same institution, some consistency likely due to consistent practice. When the baseline done at different institution (e.g. in ER and followup 6 months later) maybe not
Jen: 80-140 range in steps (which might not be exactly equivalent). One step is not significant. 3 steps perhaps is.
Nick/Marthony - haven't seen significant effects over a wide range of kVp.
Jen: is there a bigger effect when contrast is in use?
Uma: even 80-120 might be a significant effect.
Jen: newest scanners can use 80kVp typically with a higher mAs but followup on different scanner might be a bigger effect.
Rick: noise limit protects from issues with a significant drop in kvp. What about autokVp?
Jen/Matt: Likely consistency of patient weight would result in consistent setting.
Jen: Expect sites will be OK with the tech doing a Q/R of the baseline study from the scan console and examining the header to find the kVp value. For a typical patient it is rare to scan chests at 80 or 140, so likely only a one- or two-step difference.

Mark: May be difficult to enforce in systems with automatic tube current selection
Jen – presumably if the AEC parameters are consistent and the patient is consistent then likely the resulting parameters would be largely consistent.
Andy – Processwise – would like to drive to conclusion on all these feedbacks/assessments.
Kevin – working on format to facilitate that. (This spreadsheet)
drop kVp consistency requirement

The literature has not demonstrated apparent changes in volumetry from changes in kVp.
Dose concerns push down kVp. Iterative recon is used to suppress noise. No clear risk of quality problems.
kVp also affects contrast (might show up in our f50 spec but maybe not)
And dual energy highlights the improved discrimination at different energies.
Differences are driven even within a site because of different device features. Even then, the "same" kVp is not really the same between devices (different spectrums, filtration, etc.).
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4950443/ is a Chinese paper looking at volumetry that found no impact of kVp, but there was an impact for mA. (It says the mA effect is due to noise, which we are already controlling for.)
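The "steps" heuristic Jen describes above (80–140 in discrete stations; one step insignificant, three steps perhaps significant) can be sketched as a simple check. The station list, thresholds, and function name are illustrative, taken from the discussion rather than the profile:

```python
# Illustrative only: count how many discrete kVp "stations" apart two
# settings are, per the one-step/three-step discussion in the notes.

KVP_STATIONS = [80, 100, 120, 140]

def kvp_step_difference(baseline_kvp, followup_kvp):
    """Number of stations between the baseline and follow-up kVp settings."""
    return abs(KVP_STATIONS.index(baseline_kvp) - KVP_STATIONS.index(followup_kvp))

print(kvp_step_difference(120, 100))  # one step apart -> 1
print(kvp_step_difference(80, 140))   # full range -> 3
```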
Shall confirm on the scanogram the absence of artifact sources that could affect the planned volume acquisitions.
YRYR
Kevin: What things do you have your techs routinely check for? Is that something we should list in the profile?
Marthony: positioning, desired coverage area is included in image.
Kevin: And absence of artifact sources?
Marthony: Techs do check for the presence of artifacts.
Shall achieve a table speed of at least 4cm per second, if table motion is necessary to cover the required anatomy.
YRYR
OK
Shall ensure the tumors to be measured and additional required anatomic regions are fully covered.
YRYR
Kevin: Do we need to keep this requirement? Violating it means the tumor is not covered, so there is no measurement and it is self-disqualifying anyway. It would be true even if we don't say it. Removed. Done
Shall, if multiple breath-holds are required, obtain image sets with sufficient overlap to avoid gaps within the required anatomic region(s), and shall ensure that each tumor lies wholly within a single breath-hold.
YRNNF
Mark: Assuming tumors can’t be seen on the localizer radiograph, how would the tech know whether part might be in one breath-hold scan range vs the next? The only option would be to repeat the scan if the tech thinks the tumor is half in the scan, and I can’t imagine asking a tech to first identify a tumor on the scan and then repeat the scan if it is not fully included in the original volume.
Removed. Will catch in QA. Can't reasonably expect the tech to do anything at this stage that would help. Done
Shall enter on the console any factors that adversely influenced subject positioning or limited their ability to cooperate (e.g., breath hold, remaining motionless, agitation in subjects with decreased levels of consciousness, subjects with chronic pain syndromes, etc.).
NFWNFW
Kevin: Committee should consider dropping this requirement from Subject Handling since we have it here. Proposed dropping requirement.
Is it realistic to expect and require that the Rad will pick up these issues in QA, with or without notes communicated from the tech?
Curiosity about whether a repeat might be successful, or a better process in future, but any issues that would degrade volumetry would be visible to the radiologist.
Done. Mention these issues in the Rad QC. Remove the requirement.
Shall set (FOV) Consistent with baseline.
YFWNFN
Q. Why not feasible? Tech can’t easily find out what the baseline FOV was?
Mark: If a dual source scanner is used for one of the scans, but not others and either a flash or DE protocol is used, it may not be possible to match FOV. Also, yes, asking the tech to check the original DFOV and match it just doesn’t seem feasible from a workflow perspective.
Consider dropping the requirement.
If the main motivation for the FOV requirement is to maintain rough consistency of inplane pixel size, any FOV difference between scans big enough to impact volumetry would manifest in the MTF metric.
Done. Move requirement to QA.
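The pixel-size rationale above is just arithmetic: reconstructed in-plane pixel size is the display FOV divided by the matrix dimension, so an FOV mismatch changes pixel size proportionally. A small illustration (FOV values assumed; 512 is the typical CT reconstruction matrix):

```python
# Sketch of why FOV consistency is a proxy for in-plane pixel size.
# Sample FOVs are illustrative, not measured values.

def pixel_size_mm(fov_mm, matrix=512):
    """In-plane pixel size for a square reconstruction matrix."""
    return fov_mm / matrix

baseline = pixel_size_mm(350)   # ~0.684 mm pixels
followup = pixel_size_mm(400)   # ~0.781 mm pixels
print(round(baseline, 3), round(followup, 3))
```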
Image Data Reconstruction (section 3.7)
Shall select a protocol that has been previously prepared and validated for this purpose (See section 3.4.2 "Protocol Design Specification").
Shall report if any parameters are modified beyond those specifications.
YFWNNF
Kevin: Which protocol did you use for the measurements in Appendix A?
Marthony: We selected one of our standard chest protocols that complied with 3.4.2
Mark: No place to indicate where parameters are modified
Add discussion text explaining that we don't care what mechanism is used to make this information available to the radiologist during the QA activity (so the tumor can be disqualified), and also so that a later radiologist knows if the baseline is non-conformant, just as we don't care what reporting system is used. Done
Shall either
• select the same protocol as used for the baseline scan, or
• select a protocol with a recorded f50 value within 0.2 mm-1 of the f50 value recorded for the baseline scan protocol.
See section 3.4.2 for further details.
YFWNFW
Marthony: The technical expectations can only be met if done in consultation with a physicist
Kevin: Can you elaborate? The expectation is that at Acquisition/ Reconstruction time the tech has one or more validated protocols to select from. Any validation would have been performed at Protocol Design time.
Marthony: You are correct. The language used for the second bullet point is too technical for most Technologists. In fact, that is physicist language.
Kevin: Good point. We should see if we can find a way to reword without losing the similarity requirement.
Mark: I think a large number of the questions in the Technologist checklist section can’t be completed by a technologist. As an example, I don’t know any tech who would be able to understand if the MTF at 50% value of the kernel being used is within 0.2 mm^-1 of the baseline kernel. And knowing that would probably require the manufacturers to provide sample MTF curves for all their kernels as I don’t think it is realistic to expect the MTF of every kernel to be measured by a site. There were several other questions in the technologist section that I think require the input of a clinical physicist as well.
After group discussion (10/2):
A: Set the acceptability range for f50 to 0.3–0.5 and drop the consistency requirement (see line 105).

Also considered
B: record the values on a sheet for the tech to check consistency
C: record the protocols for the Analyst or Radiologist to check consistency during QA
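For concreteness, the options above behave differently at check time. A minimal sketch, contrasting option A's absolute window (0.3–0.5 mm^-1, checkable at protocol-design time with no baseline lookup) with the original relative rule (follow-up f50 within 0.2 mm^-1 of baseline); the numbers come from the discussion, while the function names and sample values are ours:

```python
# Sketch only: the two f50 checks under discussion, not profile text.

def f50_in_window(f50, lo=0.3, hi=0.5):
    """Option A: absolute acceptability window; no baseline lookup needed."""
    return lo <= f50 <= hi

def f50_consistent(baseline_f50, followup_f50, tol=0.2):
    """Original rule: relative consistency with the baseline protocol."""
    return abs(followup_f50 - baseline_f50) <= tol

# A 0.42 mm^-1 protocol passes the window, but a 0.65 follow-up would have
# failed the old relative check against a 0.42 baseline:
print(f50_in_window(0.42), f50_consistent(0.42, 0.65))  # True False
```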
Shall either
• select the same protocol as used for the baseline scan, or
• select a protocol with a recorded standard deviation within 5HU of the standard deviation recorded for the baseline scan protocol.
See section 3.4.2 for further details.
YFWNNF
Marthony: The technical expectations can only be met if done in consultation with a physicist
Mark: Impossible to assess with AEC and iterative recon. Protocols won’t be configured to deliver the same noise SD in a phantom – they’ll be set to deliver the same image quality in the patient, with no guarantee that matches to within a 5 HU difference in phantom noise SD.
With the same AEC parameters, how much variation for a single patient (not phantom) could you expect? Then how much variation between scanner makers?
Or can we safely assume that the variations are likely to be captured in the phantom tests and the residual variation for patients is unlikely to be significant?
Could do an aorta noise measurement and check it during QA, disqualifying if too high?
(On the surface this is the same as the issue above but there seems to be a deeper issue)
Do we have to give up on this and estimate the degradation to our claimed performance in the worst case?
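As written, the noise requirement is a simple tolerance check. The sketch below implements it literally (function name and sample values are ours) without resolving the AEC/iterative-recon objection raised in the discussion:

```python
# Literal sketch of the 5 HU phantom-noise consistency rule under discussion.
# Sample SD values are illustrative, not measurements.

def noise_sd_consistent(baseline_sd_hu, followup_sd_hu, tol_hu=5.0):
    """True if the follow-up phantom noise SD is within tol_hu of baseline."""
    return abs(followup_sd_hu - baseline_sd_hu) <= tol_hu

print(noise_sd_consistent(22.0, 26.5))  # within 5 HU -> True
print(noise_sd_consistent(22.0, 28.0))  # 6 HU apart -> False
```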
Shall select a QIBA-conformant protocol available on the scanner.
See section 3.4.2 for further details.

Remove, since this is covered in the first row of the table.