
# Keypoint System and Body Parts

Constants and data structures defining the 17-point human skeleton model used by PoseNet for pose detection and analysis.

## Capabilities

### Body Part Names

Array of human body part names in the standard order used by PoseNet.

```typescript { .api }
/**
 * Array of 17 body part names in standard keypoint order.
 * Each index corresponds to the keypoint ID in pose detection results.
 */
const partNames: string[];
```

The `partNames` array contains these 17 body parts in order:

```typescript
const partNames = [
  'nose',          // 0
  'leftEye',       // 1
  'rightEye',      // 2
  'leftEar',       // 3
  'rightEar',      // 4
  'leftShoulder',  // 5
  'rightShoulder', // 6
  'leftElbow',     // 7
  'rightElbow',    // 8
  'leftWrist',     // 9
  'rightWrist',    // 10
  'leftHip',       // 11
  'rightHip',      // 12
  'leftKnee',      // 13
  'rightKnee',     // 14
  'leftAnkle',     // 15
  'rightAnkle'     // 16
];
```

**Usage Examples:**

```typescript
import { partNames } from '@tensorflow-models/posenet';

// Get part name by index
const noseName = partNames[0]; // 'nose'
const leftWristName = partNames[9]; // 'leftWrist'

// Find keypoint by part name
const pose = await net.estimateSinglePose(imageElement);
const noseKeypoint = pose.keypoints.find(kp => kp.part === 'nose');

// Iterate through all body parts
partNames.forEach((partName, index) => {
  const keypoint = pose.keypoints[index];
  console.log(`${partName} (${index}): confidence ${keypoint.score.toFixed(2)}`);
});

// Filter keypoints by body region
const faceParts = partNames.filter(name =>
  ['nose', 'leftEye', 'rightEye', 'leftEar', 'rightEar'].includes(name)
);

const armParts = partNames.filter(name =>
  name.includes('Shoulder') || name.includes('Elbow') || name.includes('Wrist')
);
```
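One practical consequence of this ordering: after the nose at index 0, parts come in left/right pairs, so the mirrored counterpart of any other keypoint sits at the adjacent index. A standalone sketch of that property (the `mirroredPart` helper is hypothetical, not part of the PoseNet API, and the array is a local copy of the listing above):

```typescript
// Local copy of the 17-part listing above.
const partNames = [
  'nose', 'leftEye', 'rightEye', 'leftEar', 'rightEar',
  'leftShoulder', 'rightShoulder', 'leftElbow', 'rightElbow',
  'leftWrist', 'rightWrist', 'leftHip', 'rightHip',
  'leftKnee', 'rightKnee', 'leftAnkle', 'rightAnkle'
];

// After index 0, left parts sit at odd indices and right parts at even
// indices, so the mirror of a part is at the neighbouring index.
function mirroredPart(index: number): string {
  if (index === 0) return 'nose'; // the nose has no mirror
  return index % 2 === 1 ? partNames[index + 1] : partNames[index - 1];
}

console.log(mirroredPart(9)); // 'rightWrist'
console.log(mirroredPart(6)); // 'leftShoulder'
```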

### Body Part IDs

Object mapping body part names to their corresponding numeric IDs.

```typescript { .api }
/**
 * Mapping from body part names to numeric IDs.
 * Inverse of the partNames array, for efficient lookups.
 */
const partIds: {[jointName: string]: number};
```

**Usage Examples:**

```typescript
import { partIds } from '@tensorflow-models/posenet';

// Look up part ID by name
const noseId = partIds['nose']; // 0
const leftWristId = partIds['leftWrist']; // 9
const rightAnkleId = partIds['rightAnkle']; // 16

// Access keypoint by part name
const pose = await net.estimateSinglePose(imageElement);
const noseKeypoint = pose.keypoints[partIds['nose']];
const leftShoulderKeypoint = pose.keypoints[partIds['leftShoulder']];

// Check if specific parts are detected with high confidence
const requiredParts = ['nose', 'leftShoulder', 'rightShoulder'];
const detectedParts = requiredParts.filter(partName => {
  const keypoint = pose.keypoints[partIds[partName]];
  return keypoint.score > 0.7;
});

console.log(`Detected ${detectedParts.length}/${requiredParts.length} required parts`);

// Create custom keypoint filters
function getKeypointsByRegion(pose: Pose, region: 'face' | 'arms' | 'legs'): Keypoint[] {
  const regionParts = {
    face: ['nose', 'leftEye', 'rightEye', 'leftEar', 'rightEar'],
    arms: ['leftShoulder', 'rightShoulder', 'leftElbow', 'rightElbow', 'leftWrist', 'rightWrist'],
    legs: ['leftHip', 'rightHip', 'leftKnee', 'rightKnee', 'leftAnkle', 'rightAnkle']
  };

  return regionParts[region].map(partName => pose.keypoints[partIds[partName]]);
}
```
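Because `partIds` is the inverse of `partNames`, the mapping can be reconstructed from the name array alone. A standalone sketch of that relationship, using a local copy of the data rather than the package import:

```typescript
// Local copy of the 17-part listing above.
const partNames = [
  'nose', 'leftEye', 'rightEye', 'leftEar', 'rightEar',
  'leftShoulder', 'rightShoulder', 'leftElbow', 'rightElbow',
  'leftWrist', 'rightWrist', 'leftHip', 'rightHip',
  'leftKnee', 'rightKnee', 'leftAnkle', 'rightAnkle'
];

// Build the name-to-ID mapping, mirroring what partIds provides.
const partIds: {[jointName: string]: number} = {};
partNames.forEach((name, id) => { partIds[name] = id; });

// The round trip holds in both directions.
console.log(partIds['leftWrist']); // 9
console.log(partNames[partIds['rightAnkle']]); // 'rightAnkle'
```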

### Pose Chain Structure

Array defining parent-child relationships in the pose skeleton, used for pose assembly and tracking.

```typescript { .api }
/**
 * Parent-child relationships defining the pose skeleton structure.
 * Used by pose assembly algorithms and for skeleton drawing.
 * Each pair represents one connection in the pose tree structure.
 */
const poseChain: [string, string][];
```

The pose chain defines a tree structure with the nose as root:

```typescript
const poseChain = [
  ['nose', 'leftEye'],
  ['leftEye', 'leftEar'],
  ['nose', 'rightEye'],
  ['rightEye', 'rightEar'],
  ['nose', 'leftShoulder'],
  ['leftShoulder', 'leftElbow'],
  ['leftElbow', 'leftWrist'],
  ['leftShoulder', 'leftHip'],
  ['leftHip', 'leftKnee'],
  ['leftKnee', 'leftAnkle'],
  ['nose', 'rightShoulder'],
  ['rightShoulder', 'rightElbow'],
  ['rightElbow', 'rightWrist'],
  ['rightShoulder', 'rightHip'],
  ['rightHip', 'rightKnee'],
  ['rightKnee', 'rightAnkle']
];
```

**Usage Examples:**

```typescript
import { poseChain, partIds } from '@tensorflow-models/posenet';

// Draw skeleton connections
function drawPoseSkeleton(pose: Pose, ctx: CanvasRenderingContext2D) {
  poseChain.forEach(([parentPart, childPart]) => {
    const parentKeypoint = pose.keypoints[partIds[parentPart]];
    const childKeypoint = pose.keypoints[partIds[childPart]];

    // Only draw if both keypoints are confident
    if (parentKeypoint.score > 0.5 && childKeypoint.score > 0.5) {
      ctx.beginPath();
      ctx.moveTo(parentKeypoint.position.x, parentKeypoint.position.y);
      ctx.lineTo(childKeypoint.position.x, childKeypoint.position.y);
      ctx.stroke();
    }
  });
}

// Find pose tree depth from root (nose)
function getPoseTreeDepth(): number {
  const visited = new Set<string>();
  const depths = new Map<string, number>();

  function traverse(part: string, depth: number) {
    if (visited.has(part)) return;
    visited.add(part);
    depths.set(part, depth);

    poseChain
      .filter(([parent]) => parent === part)
      .forEach(([, child]) => traverse(child, depth + 1));
  }

  traverse('nose', 0);
  return Math.max(...Array.from(depths.values()));
}

// Validate pose chain connectivity
function validatePoseConnectivity(pose: Pose): boolean {
  return poseChain.every(([parentPart, childPart]) => {
    const parentKp = pose.keypoints[partIds[parentPart]];
    const childKp = pose.keypoints[partIds[childPart]];

    // Both keypoints should exist and have reasonable confidence
    return parentKp && childKp && parentKp.score > 0.1 && childKp.score > 0.1;
  });
}
```
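Since each `poseChain` entry names two connected parts, segment lengths fall out directly from keypoint positions. A standalone sketch under assumed data: the chain subset and the hard-coded positions below are illustrative, not real detection output.

```typescript
interface Vector2D { x: number; y: number; }

// A small subset of the pose chain, defined locally for illustration.
const chainSubset: [string, string][] = [
  ['leftShoulder', 'leftElbow'],
  ['leftElbow', 'leftWrist']
];

// Hypothetical keypoint positions keyed by part name.
const positions: {[part: string]: Vector2D} = {
  leftShoulder: { x: 0, y: 0 },
  leftElbow: { x: 3, y: 4 },  // 5 units from the shoulder
  leftWrist: { x: 6, y: 8 }   // 5 units from the elbow
};

// Euclidean length of each parent-child segment.
const lengths = chainSubset.map(([parent, child]) => {
  const dx = positions[child].x - positions[parent].x;
  const dy = positions[child].y - positions[parent].y;
  return Math.sqrt(dx * dx + dy * dy);
});

console.log(lengths); // [5, 5]
```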

### Part Channels

Array of body part channel names used internally for pose processing and segmentation.

```typescript { .api }
/**
 * Body part channel names for internal pose processing.
 * Maps to different body regions for advanced pose analysis.
 */
const partChannels: string[];
```

The part channels represent different body regions:

```typescript
const partChannels = [
  'left_face',
  'right_face',
  'right_upper_leg_front',
  'right_lower_leg_back',
  'right_upper_leg_back',
  'left_lower_leg_front',
  'left_upper_leg_front',
  'left_upper_leg_back',
  'left_lower_leg_back',
  'right_feet',
  'right_lower_leg_front',
  'left_feet',
  'torso_front',
  'torso_back',
  'right_upper_arm_front',
  'right_upper_arm_back',
  'right_lower_arm_back',
  'left_lower_arm_front',
  'left_upper_arm_front',
  'left_upper_arm_back',
  'left_lower_arm_back',
  'right_hand',
  'right_lower_arm_front',
  'left_hand'
];
```

**Usage Examples:**

```typescript
import { partChannels } from '@tensorflow-models/posenet';

// Advanced pose analysis using part channels
console.log('Available body part channels:', partChannels.length);

// Group channels by body region
const faceChannels = partChannels.filter(channel => channel.includes('face'));
const armChannels = partChannels.filter(channel => channel.includes('arm') || channel.includes('hand'));
const legChannels = partChannels.filter(channel => channel.includes('leg') || channel.includes('feet'));
const torsoChannels = partChannels.filter(channel => channel.includes('torso'));

console.log('Face channels:', faceChannels);
console.log('Arm channels:', armChannels);
console.log('Leg channels:', legChannels);
console.log('Torso channels:', torsoChannels);

// Custom pose analysis based on part channels
function analyzePoseRegions() {
  return {
    face: faceChannels.length,
    arms: armChannels.length,
    legs: legChannels.length,
    torso: torsoChannels.length,
    total: partChannels.length
  };
}
```

### Keypoint Count

Total number of keypoints detected by PoseNet.

```typescript { .api }
/**
 * Total number of keypoints in the PoseNet pose model.
 * Always 17 for the standard human pose model.
 */
const NUM_KEYPOINTS: number;
```

**Usage Example:**

```typescript
import { NUM_KEYPOINTS } from '@tensorflow-models/posenet';

// Validate pose completeness
function isPoseComplete(pose: Pose): boolean {
  return pose.keypoints.length === NUM_KEYPOINTS;
}

// Calculate pose detection rate
function calculateDetectionRate(pose: Pose, minConfidence: number = 0.5): number {
  const detectedKeypoints = pose.keypoints.filter(kp => kp.score >= minConfidence);
  return detectedKeypoints.length / NUM_KEYPOINTS;
}

const pose = await net.estimateSinglePose(imageElement);
const detectionRate = calculateDetectionRate(pose, 0.7);
console.log(`Detected ${(detectionRate * 100).toFixed(1)}% of keypoints with high confidence`);
```

## Supporting Types

```typescript { .api }
interface Keypoint {
  score: number;
  position: Vector2D;
  part: string;
}

interface Vector2D {
  x: number;
  y: number;
}

interface Pose {
  keypoints: Keypoint[];
  score: number;
}
```
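Many pose metrics reduce to distances between `Vector2D` positions. A minimal sketch using only the types above; the `keypointDistance` helper and the sample keypoints are illustrative, not part of the PoseNet API:

```typescript
interface Vector2D { x: number; y: number; }
interface Keypoint { score: number; position: Vector2D; part: string; }

// Euclidean distance between two keypoint positions.
function keypointDistance(a: Keypoint, b: Keypoint): number {
  const dx = a.position.x - b.position.x;
  const dy = a.position.y - b.position.y;
  return Math.sqrt(dx * dx + dy * dy);
}

// Hypothetical detections, e.g. for estimating inter-ocular distance.
const leftEye: Keypoint = { score: 0.9, position: { x: 100, y: 50 }, part: 'leftEye' };
const rightEye: Keypoint = { score: 0.9, position: { x: 140, y: 80 }, part: 'rightEye' };

console.log(keypointDistance(leftEye, rightEye)); // 50
```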