Predicting the CO2 adsorption performance of porous biochar based on machine learning
DOI:
Author:
Affiliation:

1. School of Civil Engineering and Architecture, Guangxi University; 2. State Key Laboratory of Subtropical Building Science, South China University of Technology

Author biography:

Corresponding author:

CLC number:

Fund project:



Abstract:

Porous biochar possesses a rich, multi-scale pore structure that gives it excellent CO2 adsorption performance. To address the low accuracy and computational complexity of traditional CO2 adsorption prediction models built from experimental data, this study applies three machine learning methods, gradient boosting decision tree (GBDT), extreme gradient boosting (XGB), and light gradient boosting machine (LGBM), to predict CO2 adsorption by biochar and compares their prediction results. The results show that the three most influential factors on CO2 uptake are, in order, the specific surface area, C content, and O content of the biochar. All three algorithms can effectively predict the CO2 adsorption performance of biochar. Among them, LGBM achieves the highest prediction accuracy, 94%; GBDT shows a clear advantage in handling anomalous samples; and XGB gives the most stable predictions across different test-set splits. These results provide an important reference for tuning and optimizing the composition and structure of biochar.
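The workflow the abstract describes (train a gradient-boosting regressor on biochar descriptors, score it on a held-out test set, and rank feature importances) can be sketched as follows. This is a minimal illustration, not the authors' code: the data here is synthetic, the feature set (specific surface area, C content, O content) is reduced to the three factors the abstract highlights, and scikit-learn's GradientBoostingRegressor stands in for GBDT; the XGBoost and LightGBM models in the study would be used analogously via `xgboost.XGBRegressor` and `lightgbm.LGBMRegressor`.

```python
# Hedged sketch of the paper's modeling workflow on SYNTHETIC data.
# Real inputs, hyperparameters, and splits in the study may differ.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 300
# Hypothetical biochar descriptors: specific surface area (m^2/g),
# C content (wt%), O content (wt%)
ssa = rng.uniform(200.0, 3000.0, n)
c_content = rng.uniform(40.0, 95.0, n)
o_content = rng.uniform(1.0, 30.0, n)
X = np.column_stack([ssa, c_content, o_content])
# Toy target loosely mimicking the reported trend (SSA dominates uptake)
y = 0.002 * ssa + 0.03 * c_content + 0.01 * o_content + rng.normal(0, 0.3, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05,
                                  max_depth=3, random_state=0)
model.fit(X_tr, y_tr)

# Held-out accuracy and impurity-based feature-importance ranking
print("test R^2:", round(r2_score(y_te, model.predict(X_te)), 3))
print("importances (SSA, C, O):", model.feature_importances_.round(3))
```

On this synthetic data the specific surface area receives the largest importance, mirroring the ranking reported in the abstract; with the paper's experimental dataset the same scoring and `feature_importances_` inspection would yield the SSA > C > O ordering it describes.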

History
  • Received: 2022-12-03
  • Revised: 2023-03-10
  • Accepted: 2023-05-19
  • Published online:
  • Publication date: