
[Artificial Intelligence] English Learning Material 01 (One Sentence a Day)

🌻 Personal homepage: 相洋同学
🥇 Learning comes from action, reflection, and persistence. Let's keep at it together!

Contents

1. Natural Language Processing (NLP)

2. Machine Learning (ML)

3. Neural Networks

4. Deep Learning

5. Loss Function

6. Gradient Descent

7. Stochastic Gradient Descent (SGD)

8. Mini-batch Gradient Descent

9. Backpropagation

10. Overfitting


1. Natural Language Processing (NLP)

Natural Language Processing (NLP) is the field of artificial intelligence that enables computers to understand, interpret, and generate human language. It bridges the gap between human communication and computer understanding, making it possible for machines to perform tasks like translation, sentiment analysis, and topic classification.

  • interpret -- to explain or make sense of
  • bridges the gap -- closes the distance between two things
  • perform tasks -- carry out tasks
  • sentiment analysis -- determining the emotional tone of text
  • topic classification -- assigning text to subject categories

2. Machine Learning (ML)

Machine Learning is a subset of artificial intelligence that involves algorithms and statistical models that enable computers to perform specific tasks without using explicit instructions. Instead, they rely on patterns and inference derived from data. The goal of ML is to enable computers to learn from and make predictions or decisions based on data.

  • subset -- a part contained within a larger set
  • algorithms -- step-by-step procedures for computation
  • statistical models -- mathematical models built on data statistics
  • specific tasks -- particular, well-defined tasks
  • explicit instructions -- directly stated, step-by-step instructions
  • patterns -- regularities found in data
  • inference -- drawing conclusions from evidence
  • derived from -- obtained or developed from

3. Neural Networks

Inspired by the human brain, neural networks are a set of algorithms, modeled loosely after its structure, that are designed to recognize patterns. They interpret sensory data through a kind of machine perception, labeling, or clustering of raw input. These networks can adapt to changing input, meaning they generate the best possible result without needing to redesign the output criteria.

  • Inspired by -- motivated or influenced by
  • modeled loosely after -- roughly imitating ("model" here means to imitate; "loosely" means not exactly)
  • recognize patterns -- identify recurring structures in data
  • sensory data -- data from the senses (images, sound, etc.)
  • perception -- the act of perceiving or becoming aware
  • clustering -- grouping similar items together
  • raw input -- unprocessed input data
  • adapt to -- adjust to
  • changing -- present participle of "change"
  • redesign -- design again
  • criteria -- standards for judging

4. Deep Learning

Deep Learning is a subset of machine learning in artificial intelligence that structures algorithms in layers to create an "artificial neural network" that can learn and make intelligent decisions on its own. This technology powers advanced applications such as voice recognition and image analysis.

  • subset -- a part contained within a larger set
  • structures -- organizes, arranges
  • layers -- stacked levels of processing
  • powers advanced applications -- drives or enables advanced applications
  • voice recognition -- identifying spoken words
  • image analysis -- extracting information from images
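
To make the idea of stacked layers concrete, here is a minimal sketch (not a trained model) of a tiny two-layer network in NumPy; the layer sizes, random weights, and input are all made up purely for illustration.

```python
import numpy as np

# Minimal sketch: a tiny two-layer network showing how deep learning
# stacks layers of weights and non-linearities (weights are random,
# not trained; all sizes are hypothetical).
rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

# Hypothetical layer sizes: 4 input features -> 8 hidden units -> 3 outputs
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 3)), np.zeros(3)

def forward(x):
    h = relu(x @ W1 + b1)   # first layer: linear transform + non-linearity
    return h @ W2 + b2      # second layer: raw output scores

x = rng.normal(size=(1, 4))  # one made-up input example
print(forward(x))            # shape (1, 3): one score per output
```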

5. Loss Function

A Loss Function in machine learning measures the difference between the actual output and the predicted output of the model. It quantifies how well the prediction model performs by assigning a cost to prediction errors.

  • actual output -- the true, observed output
  • predicted output -- the output the model predicts
  • quantifies -- expresses as a number
  • assigning -- giving, allocating
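
As a concrete illustration, the following minimal sketch computes one common loss function, mean squared error (MSE); the numbers are made up, and MSE is only one of many possible choices.

```python
import numpy as np

# Minimal sketch: mean squared error (MSE), a common loss function that
# assigns a cost to the gap between predicted and actual outputs.
def mse(y_true, y_pred):
    return np.mean((y_true - y_pred) ** 2)

y_true = np.array([3.0, -0.5, 2.0])   # actual outputs (made-up numbers)
y_pred = np.array([2.5,  0.0, 2.0])   # predicted outputs
print(mse(y_true, y_pred))            # (0.25 + 0.25 + 0.0) / 3 ≈ 0.167
```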

6. Gradient Descent

Gradient Descent is an optimization algorithm used to minimize some function by iteratively moving towards the minimum value of the function. It is commonly used in machine learning to find the best parameters for a model.

  • gradient -- the slope or direction of steepest change
  • optimization algorithm -- an algorithm for finding the best solution
  • minimize -- make as small as possible
  • iteratively -- by repeating steps
  • minimum value -- the smallest value
  • commonly -- frequently, generally
  • parameters -- values a model learns or is configured with
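
Here is a minimal sketch of gradient descent on a toy one-variable function; the starting point and learning rate are arbitrary choices for illustration.

```python
# Minimal sketch: gradient descent on the toy function f(x) = (x - 3)^2,
# whose gradient is f'(x) = 2 * (x - 3) and whose minimum is at x = 3.
def grad(x):
    return 2.0 * (x - 3.0)

x = 0.0               # arbitrary starting point
learning_rate = 0.1   # hypothetical step size
for _ in range(100):
    x -= learning_rate * grad(x)   # step opposite to the gradient

print(x)  # ends up very close to 3.0, the minimum
```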

7. Stochastic Gradient Descent (SGD)

Stochastic Gradient Descent (SGD) is a variation of the gradient descent algorithm that updates the model's parameters using only a single sample or a small batch of samples, which makes the process faster and can help avoid local minima.

  • stochastic -- random
  • variation -- a modified form, a variant
  • batch -- a group of samples processed together
  • local minima -- points that are lowest only within a neighborhood
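
The sketch below illustrates the idea on a toy linear-regression problem, updating the two parameters from a single randomly chosen sample per step; the data, learning rate, and step count are made up.

```python
import numpy as np

# Minimal sketch: stochastic gradient descent fitting y = w * x + b to
# made-up data, updating the parameters from ONE random sample per step.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=100)
y = 2.0 * X + 0.5 + rng.normal(scale=0.1, size=100)  # underlying truth: w = 2.0, b = 0.5

w, b, lr = 0.0, 0.0, 0.1
for step in range(1000):
    i = rng.integers(len(X))        # pick a single random sample
    err = (w * X[i] + b) - y[i]     # prediction error on that one sample
    w -= lr * err * X[i]            # gradient of 0.5 * err^2 with respect to w
    b -= lr * err                   # gradient of 0.5 * err^2 with respect to b

print(w, b)  # close to the true values 2.0 and 0.5
```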

8. Mini-batch Gradient Descent

Mini-batch Gradient Descent is a balance between the full batch gradient descent and stochastic gradient descent. It updates the model's parameters using a subset of the training data, rather than the full dataset or individual samples, optimizing computational efficiency.

  • full batch -- the entire training set at once
  • subset -- a part of the whole
  • training data -- data used to train the model
  • computational efficiency -- efficient use of computing resources
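
For contrast with the single-sample version above, here is a minimal sketch on the same toy linear-regression problem, averaging the gradient over small batches (16 samples here, a hypothetical choice).

```python
import numpy as np

# Minimal sketch: mini-batch gradient descent on a toy linear-regression
# problem, averaging the gradient over small batches of samples.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=100)
y = 2.0 * X + 0.5 + rng.normal(scale=0.1, size=100)  # underlying truth: w = 2.0, b = 0.5

w, b, lr, batch_size = 0.0, 0.0, 0.1, 16
for epoch in range(200):
    order = rng.permutation(len(X))              # shuffle the data each epoch
    for start in range(0, len(X), batch_size):
        idx = order[start:start + batch_size]    # one mini-batch of indices
        err = (w * X[idx] + b) - y[idx]          # errors for the whole mini-batch
        w -= lr * np.mean(err * X[idx])          # average gradient over the batch
        b -= lr * np.mean(err)

print(w, b)  # close to the true values 2.0 and 0.5
```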

9. Backpropagation

Backpropagation is a method used in artificial neural networks to calculate the gradient of the loss function with respect to each weight by the chain rule, effectively allowing for the optimization of weights to minimize loss.

  • calculate -- compute
  • with respect to -- in relation to (here, the gradient is taken in relation to each weight)
  • chain rule -- the calculus rule for differentiating composed functions
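
The following minimal sketch applies the chain rule by hand on a tiny one-hidden-layer network with a squared-error loss; the shapes, weights, and target are all made up for illustration.

```python
import numpy as np

# Minimal sketch: backpropagation on a tiny one-hidden-layer network,
# computing the gradient of a squared-error loss with the chain rule.
rng = np.random.default_rng(0)
x = rng.normal(size=(1, 3))          # one made-up input example
t = np.array([[1.0]])                # made-up target output

W1, W2 = rng.normal(size=(3, 4)), rng.normal(size=(4, 1))

# Forward pass
h = np.tanh(x @ W1)                  # hidden activations
y = h @ W2                           # network output
loss = 0.5 * np.sum((y - t) ** 2)

# Backward pass: chain rule, from the output back to each weight
dy = y - t                           # dLoss/dy
dW2 = h.T @ dy                       # dLoss/dW2
dh = dy @ W2.T                       # dLoss/dh
dW1 = x.T @ (dh * (1 - h ** 2))      # dLoss/dW1, using d(tanh z)/dz = 1 - tanh(z)^2

print(loss, dW1.shape, dW2.shape)    # each gradient matches its weight's shape
```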

10. Overfitting

Overfitting occurs when a machine learning model learns the detail and noise in the training data to the extent that it negatively impacts the model's performance on new data. This means the model is too complex, capturing noise as if it were a significant pattern, leading to poor generalization on unseen data.

  • occurs -- happens
  • detail and noise -- fine details and random fluctuations
  • to the extent that -- to the degree that
  • negatively impacts -- harms, has a bad effect on
  • performance -- how well something works
  • capturing noise -- treating random fluctuations as if they were signal
  • significant pattern -- a meaningful pattern
  • poor generalization -- weak performance on new data
  • unseen data -- data not seen during training
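
A minimal sketch of the effect, assuming a simple polynomial-fitting setup: a degree-9 polynomial fit to ten noisy points reproduces the training data almost exactly, yet typically does worse than a straight line on fresh points from the same underlying trend.

```python
import numpy as np

# Minimal sketch: overfitting a small noisy dataset with an overly
# complex model (high-degree polynomial) versus a simple one.
rng = np.random.default_rng(0)
x_train = np.linspace(0, 1, 10)
y_train = 2 * x_train + rng.normal(scale=0.2, size=10)   # underlying trend: y = 2x
x_test = np.linspace(0, 1, 50)                           # unseen data from the same trend
y_test = 2 * x_test + rng.normal(scale=0.2, size=50)

for degree in (1, 9):                                    # simple vs. overly complex model
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(degree, round(train_err, 4), round(test_err, 4))
# In a typical run, the degree-9 fit has near-zero training error but a
# larger test error: it has memorized the noise, not the underlying pattern.
```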

That is all.

The gentleman sits and discusses principles; the young rise and act on them. Let us encourage one another.

