
How to Write a Grant Application That Actually Gets Funded

Most grant applications are rejected not because the project was wrong, but because the application didn't answer the assessor's actual questions. Government assessors work from a scoring rubric — they award marks line by line, not based on overall impression. This guide covers how to write an application that scores well at every stage, from eligibility screening through to merit assessment.

Last updated: 8 March 2026

Why most applications fail before an assessor even reads them

Government grant programs receive far more applications than they can fund. A mid-tier Commonwealth program like the Industry Growth Program receives hundreds of applications per round. Before any merit assessment begins, every application goes through an eligibility screening stage — and a significant portion are eliminated there, without a single merit mark being awarded.

The eligibility screen is not generous. If you haven't met a requirement — wrong state, business too young, project already started, wrong industry classification, missing a required attachment — your application is declined without a score. No one reads your project description. No one considers whether your idea is good. The application fails a checkbox and you receive a standard rejection email.

This matters because it reframes the first job of writing a grant application: before you worry about being persuasive, you need to be compliant. Read every eligibility criterion in the program guidelines. Check every one against your business. Only once you've confirmed you meet all of them should you begin writing.

The second filter — before merit assessment — is incomplete submissions. Forgetting an attachment, submitting financial statements in the wrong format, or failing to answer a mandatory question are automatic disqualifiers in most programs. The Industry Growth Program guidelines explicitly state that applications missing required financial documentation will not proceed to assessment. Treat compliance as your foundation, not an afterthought.

How grant assessors work — and how your application is actually scored

Most people imagine grant assessment as a thoughtful reader forming a holistic impression of their application. The reality is more structured and less forgiving.

Government grant assessors — whether internal departmental staff or contracted external reviewers — work from a scoring rubric. Each assessment criterion in the program guidelines corresponds to a set of sub-questions, and each sub-question has a maximum score. The assessor reads your response, compares it to a scoring guide that describes what a 1-out-of-5 answer looks like versus a 5-out-of-5, and assigns a number. Scores are tallied. Applications are ranked. Those above the funding threshold receive offers.

What this means in practice: you don't get credit for things you didn't say. If the assessment criterion asks "How will this project create jobs in regional NSW?" and you describe the project thoroughly but don't explicitly discuss job creation, you score zero on that criterion — even if the jobs will clearly be created. Assessors score what's on the page, not what could reasonably be inferred.

Some programs publish their assessment weightings, which is invaluable. The Entrepreneurs' Programme weighted "business impact" at 60% and "business capability" at 40% in recent rounds. ARENA innovation grants have weighted "knowledge value" and "market potential" roughly equally. Where weightings are published, allocate your word count proportionally — don't spend 70% of your application on the section worth 30% of the marks.
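
Proportional allocation is simple arithmetic, but it's worth doing deliberately rather than by feel. Here is a minimal sketch in Python, using the 60/40 split quoted above as an illustration; the 3,000-word total is a hypothetical figure, not a program requirement:

```python
# Allocate a total word budget in proportion to published assessment
# weightings. The weightings and total below are illustrative only.
def allocate_words(total_words, weightings):
    """Return a dict of section -> word budget, proportional to its weight."""
    total_weight = sum(weightings.values())
    return {
        section: round(total_words * weight / total_weight)
        for section, weight in weightings.items()
    }

budget = allocate_words(3000, {"business impact": 60, "business capability": 40})
print(budget)  # {'business impact': 1800, 'business capability': 1200}
```

Run this once against your program's published weightings before drafting, and treat the result as a ceiling per section rather than a target to fill.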

Scoring rubrics also explicitly reward specificity. A rubric description for a top-mark answer typically uses language like "clearly articulates measurable outcomes," "provides credible evidence to support claims," and "demonstrates a detailed and realistic understanding of." The bottom-mark description says "vague," "lacks supporting detail," and "assertions are unsubstantiated." A specific, evidence-backed answer will outscore an eloquent but general one every time.

Read the guidelines as your outline, not just your rulebook

Most applicants read the grant guidelines once — to check eligibility — then write their application in their own structure. This is backwards, and it's the single most correctable reason for mid-tier applications to score below the funding threshold.

The correct approach: read the guidelines twice. First for eligibility. Second to build your document structure. Every assessment criterion is a section of your application. Every sub-question within each criterion is a paragraph within that section. The guidelines are telling you exactly what to write.

Take the Commonwealth Business Research and Innovation Initiative (BRII). Its assessment criteria include: "strength and feasibility of the proposed solution," "commercialisation potential," "value for money," and "capability and capacity of the applicant team." Each of these is a heading in a well-structured application. Under "commercialisation potential," the sub-criteria ask for market size, the competitive advantage of your solution, and your go-to-market strategy. Those three points become three paragraphs.

This approach ensures you actually answer every question — the most common reason for a low score is a question simply not addressed. It also signals to the assessor that you understand the program's objectives. An application structured around the assessment criteria reads as intentional and professional. One that follows the applicant's own narrative structure forces the assessor to hunt for answers, which creates friction and lowers scores.

Before you start writing, copy out every assessment criterion and sub-question from the guidelines. Annotate each one with what you'll say and what evidence you'll use to support it. Only then open a blank document. This preparation — typically an hour or two — prevents the most expensive mistakes.
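
The annotation step above can be kept as a simple structured outline. A sketch of one way to do it, using the BRII "commercialisation potential" sub-criteria quoted earlier; the answer and evidence entries are hypothetical placeholders you would replace with your own:

```python
# Map each assessment criterion to its sub-questions, your planned answer,
# and the evidence you'll cite. All content below is placeholder text.
plan = {
    "Commercialisation potential": [
        {"question": "Market size",
         "answer": "addressable market estimate with sources",
         "evidence": "industry report, qualified customer pipeline"},
        {"question": "Competitive advantage",
         "answer": "what the solution does that alternatives don't",
         "evidence": "pilot results, IP position"},
        {"question": "Go-to-market strategy",
         "answer": "channel, pricing, first three target customers",
         "evidence": "signed letter of intent"},
    ],
}

# Flag any sub-question still missing evidence before drafting begins.
for criterion, items in plan.items():
    for item in items:
        if not item["evidence"]:
            print(f"Missing evidence: {criterion} / {item['question']}")
```

The same outline works equally well in a spreadsheet or on paper; the point is that every sub-question has a planned answer and named evidence before you write a sentence of the application.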

Lead with outcomes, not background

The instinct when writing any business document is to introduce yourself first: who you are, what your business does, how long you've been operating. In grant applications, this instinct costs marks.

Assessors have read dozens of applications before yours. They don't need to be warmed up to your founding story. What they need to understand, immediately, is what this project will achieve, why it matters to the program's objectives, and whether you can deliver it. Background information belongs in a clearly labelled "About the Applicant" section — after you've already answered the substantive questions.

The opening sentence of any substantive answer should state the outcome: "This project will develop and commercialise a remote monitoring platform for agricultural water infrastructure, creating 8 full-time jobs and enabling $2.4M in recurring export revenue within 3 years." That sentence gives the assessor what they need in 30 words. Everything after it provides the evidence and detail.

A useful editing test: take the first paragraph of each application section and ask whether it could be deleted without losing any substantive information. If the answer is yes — if it's purely introductory — delete it and start with what was previously paragraph two. In most first drafts, this edit alone materially improves both clarity and the impression the application makes.

The structure of every high-scoring answer: Point, Evidence, Impact

Grant applications that score well consistently use the same three-part structure for every answer, regardless of the question. Once you understand this pattern, you can apply it to any criterion.

Point — state your claim directly at the start of the answer. Don't build up to it. "This project will create 12 full-time positions in regional NSW within 24 months" is a Point. "Our business has been growing steadily and we believe this project will contribute positively to employment outcomes" is not.

Evidence — immediately support the claim with specific, verifiable detail. For a jobs claim, evidence includes: your current headcount, the specific roles you plan to hire (job titles, functions, approximate salaries), the timeline for each hire, and the causal link between those hires and the funded project activities. Evidence answers the implicit question "how do you know?" When an assessor sees a specific claim backed by evidence, they can award full marks. When they see a claim without evidence, they can't — regardless of how plausible the claim is.

Impact — connect the outcome to the program's goals, not just your business's goals. Every grant exists to advance a government policy objective: job creation, export growth, clean energy transition, regional development, innovation diffusion. Creating 12 jobs is good for your business. Creating 12 jobs in a regional LGA with 9.1% unemployment directly advances a regional development program's stated objective. The assessor is there to fund that objective. Show them you understand the difference.

Applying Point-Evidence-Impact consistently across every criterion creates a dense, navigable application. Assessors can follow the logic. Nothing is left to inference. Credit can be awarded where it's due.

Be specific — vague answers are unmarked answers

Vague language is the most common reason grant applications score below the funding threshold. It appears in every section: project descriptions, budget justifications, capability statements, and outcome projections.

The test is simple: could this sentence appear in any business's application? "We are committed to innovation and are well-positioned for growth" could be the mission statement of every company that has ever applied for any grant. It carries no information. An assessor reading it has nothing to score and assigns low marks.

Contrast that with: "We will engage two full-time software engineers to develop a machine learning model that analyses continuous sensor data from agricultural irrigation equipment to predict failure 14 days in advance. The model reduces unplanned equipment downtime for our clients from an average of 3.2 incidents per year to an estimated 0.8, based on pilot testing with two clients over 6 months. This is our first predictive capability — distinct from the reactive diagnostic tools that represent our current product suite." This is specific. It describes what will be built, who builds it, the technical approach, the quantified outcome, the evidence base for the projection, and how it differs from existing capability.

The most commonly vague areas, and how to fix each:

Job creation: not "we expect to create several jobs" but "we will hire 3 full-time engineers and 1 project manager over 18 months, representing a 40% increase in our technical headcount."

Revenue projections: not "we expect strong growth" but "we project $1.1M in new revenue from this product in Year 1, based on a signed heads of agreement with Client A ($600K) and a pipeline of 6 qualified prospects currently in commercial negotiation."

Project activities: not "we will conduct R&D" but "we will conduct technical development across three phases: algorithm development and internal testing (months 1–8), prototype integration with Client A's infrastructure (months 9–14), and commercial pilot with three clients (months 15–18)."

How to write a project budget that assessors trust

The budget section is where many otherwise strong applications lose marks — or generate concern that lowers the overall score. A credible, detailed budget signals that the project is genuinely planned. A vague or misaligned budget signals the opposite.

The most important principle: your budget must reflect only eligible expenditure as defined in the program guidelines. Eligible categories vary by program, but common eligible items include: salaries and on-costs for staff directly engaged on the project, contractor fees, materials consumed in project activities, relevant equipment purchases, travel directly connected to the project, and IP-related costs. Common ineligible items include general business operating costs, debt repayment, costs incurred before grant approval, and entertainment.

Build your budget line by line from the eligible expenditure definition, not from your general business cost structure. Every line item must be justifiable as either directly required for the specific project or explicitly listed as eligible in the guidelines. If you include a line item that assessors consider ineligible, it raises questions about how carefully you've read the guidelines — and may cause them to scrutinise other budget items more closely.

The Export Market Development Grant (EMDG) illustrates this well. EMDG eligible expenses include: airfares and accommodation for overseas marketing trips, trade fair registration and booth costs, promotional materials specifically for overseas audiences, and fees to overseas marketing consultants. A budget that includes domestic marketing costs, general website redevelopment, or attendance at an Australian trade expo contains ineligible items. EMDG claims with ineligible items are adjusted downward at assessment — which can reduce the payment below the minimum claimable threshold, resulting in no payment at all.

Co-contribution must be clearly stated in the budget, not left implicit. Almost every grant requires you to contribute a percentage of total project cost from your own funds. State it explicitly: "Total project cost: $240,000. Grant requested: $120,000 (50%). Applicant co-contribution: $120,000 (50%), funded from operating cash reserves." Don't leave the assessor to calculate the ratio or wonder whether you've accounted for it.
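
The arithmetic behind that statement is trivial, but stating it wrongly is a needless credibility cost. A small sketch that checks the figures before you write them into the budget, using the $240,000 example from the paragraph above:

```python
# Sanity-check the co-contribution arithmetic before stating it.
# Figures are the worked example from the text, not program requirements.
total_project_cost = 240_000
grant_requested = 120_000
co_contribution = total_project_cost - grant_requested

grant_pct = grant_requested / total_project_cost * 100
co_pct = co_contribution / total_project_cost * 100

# The two shares must account for the whole project cost.
assert grant_pct + co_pct == 100.0

print(f"Total project cost: ${total_project_cost:,}. "
      f"Grant requested: ${grant_requested:,} ({grant_pct:.0f}%). "
      f"Applicant co-contribution: ${co_contribution:,} ({co_pct:.0f}%).")
```

If the program caps the grant at a fixed percentage, check the computed `grant_pct` against that cap as well, since requesting above the cap is itself an eligibility problem in many programs.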

Writing the capability and capacity section

Most grant programs include an assessment criterion that evaluates whether you can actually deliver what you're proposing. Titles vary — "Organisational capability," "Capacity to deliver," "Team credentials" — but the underlying question is consistent: do you have the people, systems, experience, and financial capacity to execute this project as described?

For established businesses, the evidence for this section comes from your track record: previous projects of similar scope delivered on time, relevant certifications, team credentials, existing partnerships, and any history of managing government funding or compliance obligations. Be concrete — don't say "our team has extensive experience in this area." Name the person, state their relevant credential or specific past project, and connect it to what they'll be doing on this project. "Our project lead, [Name], holds a PhD in materials engineering and led the commercialisation of our previous polymer line, which generated $2.1M in export revenue in its first full year" is credible. "Our team is highly experienced and well-positioned to deliver" is not.

For newer businesses, this section requires more thought. Your entity may not have a comparable past project, but you likely have founder experience, relevant advisors, certifications, or early commercial traction that demonstrates credibility. A business with a signed letter of intent from a first customer is demonstrating market credibility. A founder with 15 years of industry experience in the relevant sector is demonstrating delivery credibility even without a company track record.

Letters of support from credible third parties — industry associations, potential customers, research partners, or government stakeholders — add weight that self-assertion cannot. A letter from a recognised industry figure that says "we have reviewed [Company X's] technical approach and believe it is commercially viable" carries more weight in a capability section than three paragraphs of self-described expertise. Pursue two or three strong, specific letters rather than five generic ones.

One mistake to avoid: overselling team capacity to the point of implausibility for your business size. A $500K turnover company claiming their existing team can absorb a $1.5M project on top of normal operations will raise scepticism. Address this directly — explain exactly how you'll resource the project (new hires, contracted specialists, dedicated time allocation percentages) rather than implying your existing team will absorb the additional workload without issue.

Demonstrating value for money

Almost every grant program includes a value-for-money assessment criterion. This is widely misunderstood as meaning "apply for less money." It doesn't. Value for money means the outcomes you're promising represent a credible and substantial return on the government's investment.

A $100,000 grant that creates 15 full-time jobs in a regional area, generates $3M in export revenue over five years, and catalyses $500,000 in committed private co-investment demonstrates strong value for money. A $20,000 grant that funds a website redevelopment with no clear commercial outcome demonstrates poor value for money — not because the dollar amount is small, but because the return to the program's objectives is unclear.

The leverage framing is effective for this section. Government grants are specifically designed to unlock activity that wouldn't happen, or would happen more slowly, without government funding. Show the assessor what the grant unlocks: "The $150,000 grant will enable us to hire a dedicated commercialisation manager and fund the first two phases of product certification. This directly unlocks $800,000 in private co-investment that our lead investor has conditionally committed pending grant approval and the completion of Phase 1 certification." That's more than five dollars of private investment unlocked for every grant dollar — a compelling value-for-money argument.

For programs with a jobs or economic development objective, calculate the cost per job created and state it. If a $200,000 grant creates 10 full-time equivalent jobs, that's $20,000 per job. Regional development benchmarks in Australia typically consider $15,000–$40,000 per job as strong value for money, depending on job quality and location. If your numbers fall in this range, make it explicit — don't leave the assessor to calculate it and potentially get it wrong.
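
A quick sketch of that calculation, using the $200,000 / 10-job example from the paragraph above. The benchmark range is the one quoted in this guide, and should be replaced with any benchmark your target program publishes:

```python
# Cost per job created: grant amount divided by full-time-equivalent
# jobs. Figures and benchmark range are the examples from the text.
grant_amount = 200_000
fte_jobs = 10

cost_per_job = grant_amount / fte_jobs
print(f"Cost per job: ${cost_per_job:,.0f}")  # Cost per job: $20,000

benchmark_low, benchmark_high = 15_000, 40_000
if benchmark_low <= cost_per_job <= benchmark_high:
    print("Within the quoted regional-development benchmark range")
```

State the result explicitly in the application ("$20,000 per full-time job created") rather than leaving the assessor to divide the numbers themselves.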

Red flags that cause applications to be immediately discounted

Experienced assessors develop pattern recognition for applications that will score poorly. Certain phrases, claims, and structural choices signal low quality regardless of the underlying project. Recognising and removing these from your draft prevents otherwise strong applications from being dismissed.

"General operating costs" in the budget. Stating that grant funds will contribute to general operating expenses is an immediate red flag. Grants fund specific project activities, not business-as-usual costs. If your budget includes lines like "overheads," "general administration," or "business operations" without explicitly tying them to project activities in the eligible expenditure categories, revise the budget entirely.

Outcomes that exceed what the project could plausibly deliver. Claiming a $50,000 feasibility grant will generate $10M in revenue within two years, with no credible basis for the projection, invites scepticism about every other claim in the application. Ambitious but grounded projections are fine — explain the methodology. Implausible projections with no evidential basis undermine your credibility across all criteria.

Describing work already completed as future project activity. Presenting work that has already been done — even genuinely impressive work — as part of the proposed project raises the eligibility question of whether you've already "started." If past work is relevant context, clearly identify it as prior work that the proposed project builds upon. Don't describe it as deliverables within the funded timeline.

Generic alignment language. "This project strongly aligns with the program's objectives of innovation and growth" — stated without specific detail — adds nothing and signals that you haven't engaged seriously with the program's specific objectives. Replace it with: "This project directly advances [Program Name]'s stated objective of [specific objective from the guidelines] by [specific mechanism]."

Inconsistent figures across the application. A budget that doesn't add up, revenue projections that contradict the financial statements provided, or a headcount figure that differs between the capability section and the budget notes — these create doubt. Assessors notice numerical inconsistencies, and they signal either poor preparation or dishonesty. Audit all figures before submission.
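
That audit can be made mechanical: collect every figure that appears in more than one place, then compare the copies rather than trusting that they still agree after days of editing. A minimal sketch with hypothetical section names and figures:

```python
# Pre-submission consistency check. All figures below are hypothetical
# placeholders; substitute the numbers from your own draft.
budget_lines = {"salaries": 140_000, "equipment": 60_000, "travel": 40_000}
stated_total = 240_000

headcount_in_capability_section = 4
headcount_in_budget_notes = 4

errors = []
if sum(budget_lines.values()) != stated_total:
    errors.append("Budget line items do not add up to the stated total")
if headcount_in_capability_section != headcount_in_budget_notes:
    errors.append("Headcount differs between capability section and budget notes")

print("OK" if not errors else "\n".join(errors))
```

The value is in the discipline, not the code: any figure stated twice in the application is a figure that can drift out of agreement, so list each one once and check every other occurrence against it.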

The pre-submission review — what to check before you send

The hour before you submit is not the time to start reviewing. A proper pre-submission review takes a full working day and should be planned for in your timeline from the outset.

Start with the administrative requirements. Go through the guidelines line by line: every mandatory attachment included (financial statements, CVs, project plan, quotes, letters of support), correct file formats, correct filenames if specified, entity details (ABN, legal name, trading name, contact details) consistent across all sections and attachments, and all declarations signed. These are the items that cause automatic disqualification. They are all verifiable and correctable before submission. There is no excuse for failing here.

Next, read each answer against the assessment criterion it addresses and ask a specific question: "Have I answered what was actually asked?" Not "Have I written something relevant?" but "Have I directly answered the question?" These are different tests. It's common — especially after spending days on a draft — to write a thorough answer to a slightly different question than the one asked. This misalignment costs marks even when the content is strong.

The outside-reader test is the most valuable review you can do. Ask someone with no involvement in the project — ideally someone outside your industry — to read the application and answer three questions: What does this project do? Who benefits from it and how? Why should the government fund it? If they can't answer all three questions clearly and specifically, your application is not clear enough. This test consistently identifies vague language that the writer has become blind to from proximity to the material.

Finally, run through the red flags from the previous section and check each one. Any of them present is worth an hour of revision before submission. The time cost of fixing them before submission is trivial compared to the cost of receiving a preventable rejection.

When to hire a professional grant writer — and when not to

Professional grant writers charge $3,000–$15,000+ for a single application, depending on grant size, program complexity, and the writer's track record. That investment is justified in some situations and poor value in others.

When to DIY:

  • Grants under $50,000, particularly those with simple applications (short responses, clear criteria).
  • Rebates, vouchers, and non-competitive programs where eligibility determines the outcome, not writing quality.
  • Your first application to a program you plan to reapply to — the learning you gain, regardless of outcome, is more valuable than outsourcing it.
  • Programs where you have a very strong pre-existing relationship with the administering body.

When professional help is worth considering:

  • Large competitive grants ($200,000+) where the consultant's fee is a small percentage of the potential funding.
  • Programs where you've been rejected before and can't identify why — experienced consultants who know the program can often identify structural issues that aren't obvious to applicants.
  • Programs with multi-stage processes (expression of interest, then full application, then interview) where each stage matters.
  • Situations where you have a genuinely strong project but poor writing capacity, and the program is worth the investment.

What to look for in a grant consultant:

  • Ask specifically about their experience with the program or program type you're applying to — not just "grant writing" experience in general.
  • Ask for references from clients who received funding from that specific program.
  • Ask about their approach to cases where the application is unsuccessful — what follow-up do they provide?
  • Reputable consultants don't guarantee funding (no one can), and anyone who does is making a promise they can't keep.

One underused option: paying a consultant for a draft review rather than a full writing engagement. An experienced consultant reviewing your completed draft for $500–$1,500 can identify structural and content issues that materially improve your chances — at a fraction of the cost of a full writing engagement. If your project is strong but your application draft needs work, this is often the highest-value use of consulting budget.

Pre-submission checklist — check every item before you send

  • Verified eligibility against every criterion in the guidelines (ABN age, state/jurisdiction, industry, revenue thresholds, employee count, project start date rule)
  • Identified all assessment criteria and built the application structure to directly answer each one
  • Every mandatory question answered — no blanks, no "N/A" where a real answer is required
  • All required attachments included in the correct format (PDF/Word as specified, correct filenames if required)
  • Word limits checked and respected for every section — not just the total
  • Budget uses only eligible expenditure categories as defined in this program's guidelines
  • Co-contribution clearly stated and correctly calculated as a percentage of total project cost
  • All figures are internally consistent — budget totals match, projections are consistent with financials provided
  • Entity details (ABN, legal business name, trading name) are identical across all sections and attachments
  • Outside-reader test passed: someone outside your business can explain the project, the beneficiary, and the reason government should fund it
  • No red-flag language: no "general operating costs," no implausible projections, no work already done described as future deliverables
  • Application submitted at least 24 hours before the deadline — not the night before

Frequently asked questions

How long should a grant application be?

Match the program's stated word or page limits exactly. If no limit is specified, write as much as you need to answer every criterion clearly — typically 1,500–4,000 words for mid-tier grants. Longer is not better. A focused 2,000-word application that directly addresses every criterion will outscore a 5,000-word application that meanders. Every sentence should add information the assessor can score.

Can I reuse an application I wrote for a different grant?

You can reuse research, background material, and project descriptions as a starting point, but each application must be tailored to the specific program's assessment criteria, objectives, and language. Submitting the same document with minor edits to two different programs is obvious to experienced assessors and signals low effort. Adapt the structure and framing to each program — don't copy.

Should I call the grant administrator before applying?

Yes — for one specific purpose: to clarify a borderline eligibility question. For example, "Does a business that registered its ABN 10 months ago meet your 12-month trading requirement?" Grant administrators are generally willing to answer specific eligibility questions by phone or email. Keep it brief and specific. Don't ask them to review your draft, tell you whether you'll succeed, or discuss the assessment process — those conversations aren't appropriate.

What happens if I exceed the word limit?

Online grant portals typically prevent you from exceeding word limits by design. For PDF submissions with stated limits, exceeding them means the assessor is instructed to stop reading at the limit — your extra content simply doesn't count. Write to the limit, not over it. If you're consistently over the limit after editing, that usually means you're answering more than what was asked, not that the limit is too short.

Can I contact the assessors while my application is being reviewed?

No. Attempting to contact assessors or program staff outside the formal process during assessment is inappropriate and can result in your application being flagged or withdrawn. One polite email confirming receipt of your application is acceptable in the first week after submission. After that, wait for the outcome or the estimated decision date stated in the guidelines.

If I'm rejected, can I apply again in the next round?

Usually yes, and with a better chance of success. Request written feedback from the administering agency — most programs will provide brief notes on why your application didn't reach the funding threshold. Use that feedback to identify specific weaknesses. Many businesses that are ultimately funded applied two or three times. A rejection combined with feedback and a revised application is often a better path to funding than giving up after one attempt.

Grant information is compiled from official government sources and updated regularly. Program details, eligibility, and availability change frequently. Always verify current details on the official government website before applying.
