WSQ , IBF, SkillsFuture, PEI Approved Training Provider

WSQ - Fine-Tuning LLM Models and RAG

This WSQ Fine-Tuning LLM Models and RAG course equips learners with the technical knowledge to build, fine-tune, and deploy custom large language models (LLMs) and retrieval-augmented generation (RAG) systems. Learners will explore core NLP applications powered by transformers and attention mechanisms, gaining hands-on experience with word embeddings, tokenization, and chunking strategies. The course also introduces AI agents and their integration with LLMs in practical use cases.

Participants will delve into fine-tuning methodologies such as Supervised Fine-Tuning (SFT), Parameter-Efficient Fine-Tuning (PEFT), and Low-Rank Adaptation (LoRA). Advanced strategies such as Group Relative Policy Optimization (GRPO) and reinforcement learning are also covered. Learners will implement and deploy LLMs using Hugging Face libraries, datasets, and tokenizers, preparing them for real-world applications in NLP, AI product development, and data science projects.

Learning Outcomes

By the end of the course, learners should be able to:

  • LO1: Identify common NLP applications and use cases based on attention mechanisms and transformer architecture
  • LO2: Perform word embedding and build language models
  • LO3: Train NLP models with various machine learning approaches
  • LO4: Determine NLP strategies based on the Hugging Face framework

Course Brochure

Download WSQ – Advanced NLP with Large Language Models (LLM) Brochure

Skills Framework

This course follows the guidelines of the Text Analytics and Processing (ICT-DIT-5029-1.1) TSC under the ICT Skills Framework.

Certification

  • Certificate of Completion from Tertiary Infotech - Upon meeting at least 75% attendance and passing the assessment(s), participants will receive a Certificate of Completion from Tertiary Infotech.

  • OpenCerts from SkillsFuture Singapore - After passing the assessment(s) and achieving at least 75% attendance, participants will receive an OpenCert (also known as a Statement of Achievement) from SkillsFuture Singapore, certifying that they have achieved the Competency Standard(s) in the above Skills Framework.

WSQ Funding

WSQ funding is applicable only to Singapore Citizens and Permanent Residents. Funding support is subject to eligibility and funding caps.

Effective for courses starting from 1 Jan 2024
Full Fee: $900 (GST-exclusive)
GST: $81.00
Nett Fee after Funding (Incl. GST):
  • Baseline: $531.00
  • MCES / SME: $351.00

Baseline: Singapore Citizen/PR aged 21 and above
MCES (Mid-Career Enhanced Subsidy): Singapore Citizen aged 40 and above

Upon registration, we will advise further on how to tap on the WSQ Training Subsidy.


You can pay the nett fee (after the WSQ training subsidy) by the following:

SkillsFuture Enterprise Credit (SFEC)

Eligible Singapore-registered companies can tap on $10,000 SFEC to cover out-of-pocket expenses. Click here to submit SkillsFuture Enterprise Credit

SkillsFuture Credit (SFC)

Eligible Singapore Citizens can use their SFC to offset the course fee payable after funding, but the $4,000 Additional SFC (Mid-Career Support) cannot be used. Click here for SkillsFuture Credit submission

UTAP

Eligible NTUC members can apply for 50% of the unfunded fee from UTAP, capped at $250/year (for members aged 40 and above, capped at $500/year). Click here to submit UTAP

PSEA

Eligible Singapore Citizens can use their PSEA funds to offset course fee payable after funding.

To check Post-Secondary Education Account (PSEA) eligibility for this course, visit SkillsFuture (course code: TGS-2023018987)
  • Scroll down to “Keyword Tags” to verify PSEA eligibility.
  • If there is “PSEA” under the keyword tags, the course is eligible for PSEA.

Once you have confirmed your PSEA eligibility, please download and fill in the PSEA Withdrawal Form and email it to us.

Course Code: TGS-2023018987

Fee

$900.00 (GST-exclusive)
$981.00 (GST-inclusive)

The course fee listed above is before subsidy/grant, if applicable. We will apply for the grant and send you an invoice with the nett fee.

Course Date


Post-Course Support

  • We provide free consultation related to the subject matter after the course.
  • Please email your queries to enquiry@tertiaryinfotech.com and we will forward your queries to the subject matter experts.

Course Cancellation/Reschedule Policy

  • You can register your interest without upfront payment. There is no penalty for withdrawing from the course before the class commences.
  • We reserve the right to cancel or re-schedule the course due to unforeseen circumstances. If the course is cancelled, we will refund 100% for any paid amount.
  • Note that the venue of the training is subject to change depending on classroom availability.

Course Details

Topic 1: Introduction to Large Language Models (LLM) and AI Agents

  • Overview of transformer architecture and attention mechanisms in LLMs
  • Introduction to AI agents
  • NLP applications powered by LLMs and AI agents
  • Use cases of LLMs and AI agents
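A toy illustration may help make the attention mechanism named above concrete. The sketch below implements scaled dot-product attention in plain Python; the `softmax` and `attention` helpers and the hand-picked 2-dimensional vectors are illustrative stand-ins, not code from the course:

```python
from math import exp, sqrt

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    es = [exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(QK^T / sqrt(d)) V."""
    d = len(Q[0])
    out = []
    for q in Q:
        # Similarity of this query against every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / sqrt(d) for k in K]
        weights = softmax(scores)
        # Each output row is a weighted average of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# Three toy 2-d token vectors standing in for learned embeddings.
Q = K = V = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
ctx = attention(Q, K, V)
print([round(x, 3) for x in ctx[0]])
```

Real transformer layers apply learned projections to produce Q, K, and V, run many such heads in parallel, and operate on hundreds of dimensions, but the arithmetic is exactly this weighted averaging.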

Topic 2: Retrieval-Augmented Generation (RAG)

  • Introduction to Retrieval-Augmented Generation (RAG)
  • Use cases of RAG
  • Overview of tokenization and word embeddings
  • Overview of chunking strategies and vector databases
  • Build a RAG system
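The bullets above outline a complete pipeline, which can be sketched end to end in plain Python. Everything here is a deliberately simplified stand-in: `embed` is a bag-of-words term-frequency counter rather than a neural embedding model, and a plain list of chunks plays the role of a vector database:

```python
from collections import Counter
from math import sqrt

def chunk_text(text, chunk_size=8, overlap=2):
    """Split text into overlapping word windows (a simple chunking strategy)."""
    words = text.split()
    step = chunk_size - overlap
    return [" ".join(words[i:i + chunk_size]) for i in range(0, len(words), step)]

def embed(text):
    """Toy 'embedding': a sparse term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query, chunks, top_k=1):
    """Return the top-k chunks most similar to the query (the 'R' in RAG)."""
    q = embed(query)
    ranked = sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)
    return ranked[:top_k]

doc = ("LoRA adapts large language models with low rank updates. "
       "Vector databases store embeddings for fast similarity search. "
       "Chunking splits long documents into retrievable pieces.")
chunks = chunk_text(doc, chunk_size=8, overlap=2)
context = retrieve("how do vector databases store embeddings", chunks)[0]
print(context)  # the retrieved chunk would be prepended to the LLM prompt
```

In a production RAG system the retrieved chunks are prepended to the user's prompt before it reaches the LLM, and dense embeddings plus an approximate-nearest-neighbour index replace the toy pieces here.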

Topic 3: Fundamentals of Fine-Tuning LLM

  • Fundamentals of LLM Fine-Tuning
  • Supervised Fine-Tuning (SFT) for custom LLM tasks
  • Parameter-Efficient Fine-Tuning (PEFT)
  • Low-Rank Adaptation (LoRA) for fine-tuning LLMs
  • Group Relative Policy Optimization (GRPO)
  • Reinforcement Learning (RL) for fine-tuning
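Of the methods listed, LoRA lends itself to a compact numeric sketch: rather than updating a full d x d weight matrix, LoRA trains two small factors B (d x r) and A (r x d) and adds the scaled product (alpha / r) * BA to the frozen weights. The dimensions and values below are invented purely for illustration:

```python
def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)] for row in X]

d, r, alpha = 4, 1, 2          # toy hidden size, LoRA rank, scaling factor

# Frozen pretrained weight W (d x d) -- never updated during LoRA training.
W = [[1.0 if i == j else 0.0 for j in range(d)] for i in range(d)]

# Trainable low-rank factors B (d x r) and A (r x d). Real LoRA initializes
# B to zero so W' = W at the start; nonzero toy values make the math visible.
B = [[0.5] for _ in range(d)]
A = [[0.1, 0.2, 0.3, 0.4]]

# Effective weight used at inference: W' = W + (alpha / r) * B @ A
BA = matmul(B, A)
W_eff = [[W[i][j] + (alpha / r) * BA[i][j] for j in range(d)] for i in range(d)]

full_params = d * d            # parameters a full fine-tune would update
lora_params = d * r + r * d    # parameters LoRA actually trains
print(full_params, lora_params)  # 16 vs 8; the gap grows rapidly with d
```

At realistic sizes the saving is dramatic: for d = 4096 and r = 8, a full update touches about 16.8M parameters while the LoRA factors hold roughly 65k.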

Topic 4: Fine-Tuning LLM Implementation and Deployment

  • Overview of Hugging Face Fine-Tuning Libraries
  • Implementing Fine-Tuning with Hugging Face Libraries
  • Using Hugging Face datasets and tokenizers for LLM fine-tuning
  • Deploying and testing fine-tuned models
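Libraries such as Hugging Face's hide the training loop behind high-level APIs, but the supervised fine-tuning mechanics they implement reduce to: run labelled examples forward, measure a loss, and nudge the weights downhill. The single-weight model below is a hypothetical miniature of that loop, not the Hugging Face API:

```python
# Toy "supervised fine-tuning": a pretrained scalar weight is adapted to new
# labelled data by gradient descent on mean squared error.
pretrained_w = 1.0                    # stands in for frozen pretrained knowledge
dataset = [(1.0, 3.0), (2.0, 6.0), (3.0, 9.0)]  # (input, target) pairs; true w = 3

def loss(w):
    """Mean squared error of the model y = w * x over the dataset."""
    return sum((w * x - y) ** 2 for x, y in dataset) / len(dataset)

def grad(w):
    """Analytic gradient of the mean squared error with respect to w."""
    return sum(2 * (w * x - y) * x for x, y in dataset) / len(dataset)

w = pretrained_w
initial_loss = loss(w)
for _ in range(100):                  # the loop a Trainer-style API would run
    w -= 0.05 * grad(w)               # SGD step with learning rate 0.05
final_loss = loss(w)
print(round(w, 3), final_loss < initial_loss)  # 3.0 True
```

A real fine-tune swaps the scalar for billions of parameters, the analytic gradient for backpropagation, and the tuple list for a tokenized dataset, but the loop shape is the same.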

Final Assessment

  • Written Assessment - Short Answer Questions (WA-SAQ)
  • Practical Performance (PP)

Course Info

Promotion Code

Promo or discount codes cannot be applied to WSQ courses.

Minimum Entry Requirement

Knowledge and Skills

  • Able to operate basic computer functions
  • Minimum 3 GCE ‘O’ Levels Passes including English or WPL Level 5 (Average of Reading, Listening, Speaking & Writing Scores)

Attitude

  • Positive Learning Attitude
  • Enthusiastic Learner

Experience

  • Minimum of 1 year of working experience.

Minimum Software/Hardware Requirement

Software:

You can download and install the following software:

Hardware: Windows and Mac Laptops

About Progressive Wage Model (PWM)

The Progressive Wage Model (PWM) helps to increase wages of workers through upgrading skills and improving productivity. 

Employers must ensure that their Singapore citizen and PR workers meet the PWM training requirements of attaining at least 1 Workforce Skills Qualification (WSQ) Statement of Attainment, out of the list of approved WSQ training modules.

For more information on PWM, please visit MOM site.

Funding Eligibility Criteria

Individual Sponsored Trainee

  • Singapore Citizens or Singapore Permanent Residents aged 21 and above
  • From 1 October 2023, attendance-taking for SkillsFuture Singapore's (SSG) funded courses must be done digitally via the Singpass App. This applies to both physical and synchronous e-learning courses.
  • Trainee must pass all prescribed tests/assessments and attain 100% competency.
  • We reserve the right to claw back the funded amount from the trainee if he/she did not meet the eligibility criteria.

Employer Sponsored Trainee

  • Singapore Citizens or Singapore Permanent Residents who are DIRECT EMPLOYEES of the sponsoring company
  • From 1 October 2023, attendance-taking for SkillsFuture Singapore's (SSG) funded courses must be done digitally via the Singpass App. This applies to both physical and synchronous e-learning courses.
  • Trainee must pass all prescribed tests/assessments and attain 100% competency.
  • We reserve the right to claw back the funded amount from the employer if the trainee did not meet the eligibility criteria.

 SkillsFuture Credit: 

  • Eligible Singapore Citizens can use their SkillsFuture Credit to offset course fee payable after funding.

 PSEA:

  • To check for Post-Secondary Education Account (PSEA) eligibility, go to the mySkillsFuture portal and search for this course code.
  • Scroll down to "Keyword Tags" to verify PSEA eligibility.
  • If there is “PSEA” under the keyword tags, the course is eligible for PSEA.
  • If there is no “PSEA” under the keyword tags, the course is ineligible for PSEA.
  • Not all courses are eligible for PSEA funding.

 Absentee Payroll (AP) Funding: 

  • $4.50 per hour, capped at $100,000 per enterprise per calendar year.
  • AP funding will be computed based on the actual number of training hours attended by the trainee.

 SFEC:

  • If the Training Provider has submitted an enrolment for a course fee grant claim in the Training Partners Gateway (TPGateway), SSG will be able to derive SFEC funding based on this record. There is no need for the enterprise to submit any claim request; the SFEC claim will be automatically generated and disbursed.
  • Where there is no such record, eligible employers are required to submit an SFEC claim after course completion via the SFEC microsite.
  • SkillsFuture Enterprise Credit (SFEC) Microsite 

 

Steps to Apply for SkillsFuture Credit Claim

  • Our staff will send you an invoice with the fee breakdown.
  • Log in to the MySkillsFuture portal, select the course you are enrolling in, and enter the course date and schedule.
  • Enter the course fee payable by you (including GST) and the amount of credit to claim.
  • Upload your invoice and click ‘Submit’.

SkillsFuture Level-Up Programme

The SkillsFuture Level-Up Programme provides greater structural support for mid-career Singaporeans aged 40 years and above to pursue a substantive skills reboot and stay relevant in a changing economy. For more information, visit SkillsFuture Level-Up Programme.

Get Additional Course Fee Support Up to $500 under UTAP

The Union Training Assistance Programme (UTAP) is a training benefit provided to NTUC Union Members to encourage skills upgrading and help minimize training costs. NTUC Union Members can receive 50% funding (capped at $500 per year) under UTAP.

For more information visit NTUC U Portal – Union Training Assistance Program (UTAP)

Steps to Apply UTAP

  • Log in to your U Portal account to submit your UTAP application upon completion of the course.

Note

  • SSG subsidy is available for Singapore Citizens, Permanent Residents, and Corporates.
  • All Singaporeans aged 25 and above can use their SkillsFuture Credit to pay. For more details, visit www.skillsfuture.gov.sg/credit
  • An unfunded course fee can be claimed via SkillsFuture Credit or paid in cash.
  • UTAP funding for NTUC Union Members is capped at $250 for 39 years and below and at $500 for 40 years and above.
  • The course fee is paid to the training provider first; the learner claims the UTAP support amount after the class ends.

Appeal Process

  1. The candidate has the right to disagree with the assessment decision made by the assessor.
  2. When giving feedback to the candidate, the assessor must check whether the candidate agrees with the assessment outcome.
  3. If the candidate agrees with the assessment outcome, the assessor and the candidate must sign the Assessment Summary Record.
  4. If the candidate disagrees with the assessment outcome, he/she should not sign the Assessment Summary Record.
  5. If the candidate intends to appeal the decision, he/she should first discuss the matter with the assessor/assessment manager.
  6. If the candidate is still not satisfied with the decision, the candidate must notify the assessor of the decision to appeal. The assessor will reflect the candidate’s intention in the Feedback Section of the Assessment Summary Record.
  7. The assessor will notify the assessment manager about the candidate’s intention to lodge an appeal.
  8. The candidate must lodge the appeal within 7 days, giving reasons for the appeal.
  9. The assessor can help the candidate with writing and lodging the appeal.
  10. The assessment manager will collect information from the candidate and assessor and give a final decision.
  11. A record of the appeal and any subsequent actions and findings will be made.
  12. An Assessment Appeal Panel will be formed to review the appeal and give a decision.
  13. The outcome of the appeal will be made known to the candidate within 2 weeks from the date the appeal was lodged.
  14. The decision of the Assessment Appeal Panel is final and no further appeal will be entertained.
  15. Please click the link below to fill in the Candidate's Appeal Form.

Job Roles

  • NLP Engineer
  • Data Scientist (specializing in text data)
  • Machine Learning Engineer (NLP focus)
  • Computational Linguist
  • AI Research Scientist (language models)
  • Chatbot Developer
  • Text Mining Specialist
  • AI Solutions Architect (with NLP projects)
  • Conversational AI Designer
  • Search Algorithm Developer
  • Recommendation System Engineer (content-based)
  • Content Analysis Engineer
  • Information Retrieval Specialist
  • Machine Translation Developer
  • Speech Recognition Engineer

Trainers

Dr. Alfred Ang: Dr. Alfred Ang is a distinguished technology leader, AI researcher, and educator with over 20 years of experience in artificial intelligence, cybersecurity, and cloud computing. As the Chief Instructional Designer and CTO of Tertiary Infotech, he has spearheaded the development of over 500 advanced technology courses and led multiple AI-driven innovation projects across industries. His expertise spans deep learning, natural language processing, and enterprise AI system design, with a strong focus on fine-tuning large language models (LLMs) and optimizing retrieval-augmented generation (RAG) pipelines.

In “Fine-Tuning LLM Models and RAG,” Dr. Ang provides in-depth insights into customizing foundation models for domain-specific applications. He guides learners through advanced prompt engineering, model retraining, and integration with vector databases for RAG systems. His sessions emphasize practical experimentation with open-source LLM frameworks, enabling participants to build optimized, high-performance AI solutions tailored to real-world enterprise needs.

Tan Woei Ming: Tan Woei Ming is a data scientist and AI engineer with over 15 years of experience in machine learning, deep learning, and AI-driven automation. He has led industrial AI projects in the semiconductor and manufacturing sectors, deploying predictive analytics and computer vision systems for process optimization. Holding a Master’s in Intelligent Systems from the National University of Singapore, Woei Ming is deeply experienced in developing and fine-tuning neural networks using TensorFlow, PyTorch, and Hugging Face libraries.

In “Fine-Tuning LLM Models and RAG,” Woei Ming teaches participants how to customize and deploy fine-tuned LLMs for business-critical workflows. His sessions explore parameter-efficient tuning, embedding generation, and integration with retrieval systems to enhance contextual accuracy. Combining strong theoretical foundations with hands-on experimentation, he helps learners gain the technical expertise needed to adapt large language models for specific organizational domains.

Yeo Hwee Theng: Yeo Hwee Theng is a data science and AI strategist with extensive experience leading enterprise AI adoption across healthcare, finance, and government sectors. As a Data & Analytics Product Lead at Amplify Health and a former AI Architect at Huawei, she has designed and implemented large-scale machine learning and analytics systems. Her academic background includes a Master of Technology in Enterprise Business Analytics from NUS, where she specialized in data architecture and applied AI.

In “Fine-Tuning LLM Models and RAG,” Hwee Theng focuses on aligning data strategy with LLM fine-tuning workflows. Her sessions delve into model evaluation, data curation, and governance for retrieval-augmented AI systems. She emphasizes the practical integration of AI pipelines into enterprise infrastructure, empowering learners to operationalize LLMs responsibly and efficiently across real-world use cases.

Teh Siew Yee: Teh Siew Yee is a data analytics and digital transformation leader with over two decades of experience across banking, aviation, and technology sectors. With a Master of IT in Business (AI) from SMU and leadership roles at Standard Chartered, TikTok, and HP, he has developed expertise in AI governance, data management, and analytics strategy. As an ACLP-certified trainer, he is known for delivering clear, industry-relevant instruction that bridges business goals with technical execution.

In “Fine-Tuning LLM Models and RAG,” Siew Yee helps participants understand the lifecycle of LLM customization—from data preparation to deployment. His sessions highlight best practices for prompt optimization, hybrid retrieval techniques, and responsible AI alignment. Through guided labs and real-world examples, he equips learners with the skills to fine-tune and deploy scalable, explainable AI models using modern RAG frameworks.

Truman Ng: Truman Ng is an AI infrastructure and cloud automation specialist with more than 20 years of experience in enterprise networking, cybersecurity, and intelligent systems integration. He holds PMP, ACTA, and Huawei HCIE certifications and has trained global corporate teams in DevOps, AI deployment, and cloud orchestration. His expertise lies in building scalable AI pipelines and integrating model fine-tuning within secure cloud-based environments.

In “Fine-Tuning LLM Models and RAG,” Truman teaches how to operationalize and optimize fine-tuned LLMs within hybrid and distributed infrastructures. His sessions focus on model deployment, GPU optimization, and the secure management of vector databases for RAG workflows. By merging AI engineering with infrastructure best practices, he enables learners to design, deploy, and maintain robust end-to-end AI systems with enterprise-level scalability.

Customer Reviews (8)

All reviews were submitted by course participants/trainees against the standard feedback questions (course expectations, trainer knowledge, training environment). Comments, where given, are quoted below.

  • will recommend (Posted on 4/6/2025)
  • will recommend (Posted on 4/6/2025)
  • will recommend (Posted on 12/18/2024)
  • will recommend: “The trainer is very knowledgeable. I hope the source code can tally more with the slides.” (Posted on 10/27/2024)
  • will recommend (Posted on 1/27/2024)
  • Interesting course in this GPT era: “So far so good, the pace matches my learning.” (Posted on 12/1/2023)
  • Already good (Posted on 12/1/2023)
  • will recommend (Posted on 5/25/2023)


You May Be Interested In These Courses

  • WSQ - Python Fundamental Course for Beginners: 536 Review(s), $750.00 (GST-exclusive) / $817.50 (GST-inclusive)
  • WSQ - Build and Deploy Python Applications with Vibe Coding: 171 Review(s), $750.00 (GST-exclusive) / $817.50 (GST-inclusive)
  • WSQ - Build Agentic AI and NLP Applications with Langflow: 29 Review(s), $750.00 (GST-exclusive) / $817.50 (GST-inclusive)
  • WSQ - Text Analytics with R: 2 Review(s), $720.00 (GST-exclusive) / $784.80 (GST-inclusive)
  • WSQ - Python Text Mining and Analytics: Transforming Text into Insights: 3 Review(s), $720.00 (GST-exclusive) / $784.80 (GST-inclusive)