Master’s Thesis Formatting (Series) 02: Auditable Presentation of Research Design — The Layout Logic of Methods, Data, and Ethics/Declarations


At a glance

Use this page to apply “Master’s Thesis Formatting (Series) 02: Auditable Presentation of Research Design — The Layout Logic of Methods, Data, and Ethics/Declarations” formatting to Word (.docx): format with the template, or upload a document to run a format check.

How to format / how to check formatting

  1. Click “Format using this template” to open the tool with this template pre-selected.
  2. Upload your .docx: you can run “Check formatting” first to get a score and issues, then format.
  3. Download the result and verify key rules using the sections and self-check notes below.

Key notes & self-check

Quick actions (check/format)

  • Open Studio: /en/studio (upload .docx → auto-check → format → download)
  • Browse templates: /en/guides

Self-check checklist

  • TOC updates correctly (headings use styles)
  • Page numbers start at the right place (sections & restart)
  • Heading hierarchy is consistent (avoid manual bold/size)
  • Captions are consistent (numbering & references)


1. Why master’s theses place greater emphasis on “auditability”

Undergraduate theses are often judged by “completeness”. A master’s thesis is closer to an inspectable research output. Reviewers’ core questions are:

  • Are your conclusions produced by a clear and auditable research process?
  • If someone repeats what you described, can they obtain comparable results?

So formatting requirements for Methods/Data/Ethics are not formalism — they upgrade your research from “a narrative” to an auditable object.

---

2. The reader tasks served by Methods, Data, and Ethics/Declarations

Think of them as three “review checklists”:

2.1 Methods: let readers understand “what you did”

Readers want to quickly confirm: research question → variables/concepts → sample/materials → procedures/tools → analysis path.

2.2 Data: let readers judge “evidence strength and boundaries”

Readers want: data source, collection/cleaning, missingness/bias, availability, reproducibility conditions (or reasons it can’t be shared).

2.3 Ethics/Declarations: let the study “pass process gates”

Master’s theses often involve surveys/interviews/platform data/user-generated content. Schools and journals care about compliance: consent, anonymization, conflicts of interest.

---

3. The key shift: from “clear writing” to structured presentation

The series emphasizes workflow and visual logic. The core principle is:

Auditability is not something readers dig out of long prose — it happens when you put structured information in front of them.

This directly leads to three format strategies:

  1. Stable location: Methods/Data/Ethics should appear where readers expect them
  2. Stable hierarchy: each part uses a consistent sub-section structure to reduce search cost
  3. Stable carriers: use tables/checklists/flow diagrams to hold “review-type information”, not scattered paragraphs

---

4. Methods section logic: make the research process checkable

4.1 Methods is a traceable pipeline, not a single paragraph

Stabilize the internal structure (we don’t teach content writing, only structure and presentation):

  • Study design type (experiment / quasi-experiment / survey / case study / text mining, etc.)
  • Population & sample (inclusion/exclusion rules, sample size, source)
  • Measures & variables (definitions, scales/indices, coding rules)
  • Procedure & materials (steps, stimuli/questionnaire structure, tool versions)
  • Analysis plan (models/tests/software/thresholds/robustness checks)

4.2 Visual hierarchy: reviewers should see “coverage” at a glance

Two “review-type carriers” often beat long prose:

  • a process box: a short steps list or flow chart (collect → clean → model → test)
  • a variables/measures table: variable, definition, scale source, items, scoring, reliability/validity (if applicable)

Reason: reviewers should not have to mine long paragraphs for key fields.
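A minimal sketch of such a variables/measures table (every name, scale, and value below is illustrative, not taken from any real study):

```
| Variable       | Definition                     | Scale source                 | Items | Scoring        | Reliability |
|----------------|--------------------------------|------------------------------|-------|----------------|-------------|
| Satisfaction_A | Overall evaluation of service  | Adapted from Scale X (hyp.)  | 5     | 5-point Likert | α = .85     |
| Intent_B       | Intention to reuse within 6 mo | Self-developed (pre-tested)  | 3     | 5-point Likert | α = .81     |
```

Keep the same column set for every variable; a reviewer can then scan one table instead of reconstructing definitions from prose.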

---

5. Data section logic: make evidence bounded, reproducible, and interpretable

5.1 Minimum auditable fields for data description

The common failure is not a missing Data section, but one written like an essay, without the key fields. A minimum auditable description lets readers answer:

  • where the data comes from (institution/platform/database/self-collected)
  • the time range and scope
  • sample composition and selection rules
  • what cleaning and missing-data handling you did
  • whether data can be shared; if not, why, and what alternatives exist (aggregates, synthetic data, code)

5.2 Data is best presented as a table

Use a “data card table” (structure only):

  • source and acquisition method
  • sample size and composition (raw total, exclusions, final sample)
  • key fields/variables list
  • cleaning rules summary
  • availability statement (public/restricted/not shareable + reason)

The table turns “data credibility” from a textual impression into checkable facts.
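A minimal sketch of a data card table (all entries are hypothetical placeholders showing the structure, not a real dataset):

```
| Field          | Entry                                                        |
|----------------|--------------------------------------------------------------|
| Source         | Public comments from Platform P, collected via official API  |
| Time range     | 2023-01 to 2023-06                                           |
| Raw total      | 12,000 records                                               |
| Exclusions     | 1,800 (duplicates, non-target language, suspected bots)      |
| Final sample   | 10,200 records                                               |
| Key fields     | user_id (hashed), timestamp, comment_text                    |
| Cleaning       | De-duplication; language filter; ID hashing                  |
| Availability   | Not shareable (platform terms); aggregates and code provided |
```

One row per auditable field makes gaps visible: an empty cell is an unanswered review question.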

---

6. Ethics/Declarations logic: it’s a process gate, not a one-line slogan

6.1 Why master’s theses more often require ethics statements

They more often touch human-related data: surveys, interviews, classroom experiments, platform comments, corporate datasets. Even without formal IRB, advisors and schools still care about baseline compliance.

6.2 What an ethics statement should contain (fields, not phrasing templates)

Cover three categories:

  • Participants & consent: informed consent, withdrawal, compensation
  • Privacy & anonymization: de-identification, storage, access control
  • Risk & compliance: sensitive populations/data, approvals (if any), conflicts (if applicable)

Treat it as a field checklist, not a slogan.

---

7. Common failure modes (reviewers immediately lose trust)

  1. Methods prose pile-up: key fields scattered across chapters; no complete pipeline
  2. Measures not systematized: variable names change across sections, so readers cannot map them to definitions
  3. Missing source/cleaning: “collected X questionnaires” without filters/missing-data handling
  4. No explanation for non-shareable data: auditability is unclear
  5. Hollow ethics statement: “we complied with ethics” without checkable fields
  6. Tool versions missing: software/library versions are absent, so the analysis cannot be reproduced under controlled conditions

These failures make your research non-auditable.

---

8. Translate the “why” into acceptance criteria (structure + auditable fields only)

8.1 Methods acceptance criteria

  • stable sub-section structure exists (design / sample / measures / procedure / analysis)
  • a process summary exists (list or flow chart)
  • variables/measures are mappable (consistent naming, definitions locatable)

8.2 Data acceptance criteria

  • minimum fields exist: source, scope, selection, cleaning, missing-data handling
  • sample composition/exclusion logic is visible (often as a table)
  • availability statement exists (public/restricted/not shareable + reason)

8.3 Ethics/Declarations acceptance criteria

  • consent, privacy, risk/compliance fields are covered
  • if there is no formal approval, a reason and alternative protections are stated
  • if platform/secondary data is used, compliance boundaries and anonymization are described

We validate the “auditable structure”, not detailed citation style rules or exact typography numbers.