
# Metadata and Statistics

Sharp provides comprehensive access to image metadata and pixel-level statistics without requiring full image decoding, enabling efficient analysis and processing workflows.

## Capabilities

### Image Metadata Access

Retrieve image metadata including format, dimensions, color information, and embedded data.

```javascript { .api }

/**
 * Access image metadata without decoding pixel data
 * @returns Promise resolving to metadata object
 */
metadata(): Promise<Metadata>;

interface Metadata {
  /** Image format (jpeg, png, webp, gif, tiff, etc.) */
  format: string;
  /** Total file size in bytes (Buffer/Stream input only) */
  size?: number;
  /** Image width in pixels (before EXIF orientation) */
  width: number;
  /** Image height in pixels (before EXIF orientation) */
  height: number;
  /** Color space interpretation */
  space: string;
  /** Number of channels (1=grey, 2=grey+alpha, 3=RGB, 4=RGBA/CMYK) */
  channels: number;
  /** Pixel depth format (uchar, char, ushort, float, etc.) */
  depth: string;
  /** Image density in pixels per inch (if present) */
  density?: number;
  /** Chroma subsampling (JPEG: 4:2:0, 4:4:4, etc.) */
  chromaSubsampling?: string;
  /** Progressive/interlaced encoding */
  isProgressive: boolean;
  /** Palette-based encoding (GIF, PNG) */
  isPalette: boolean;
  /** Bits per sample for each channel */
  bitsPerSample?: number;
  /** EXIF orientation value (1-8) */
  orientation?: number;
  /** Auto-oriented dimensions */
  autoOrient: {
    width: number;
    height: number;
  };
  /** Embedded ICC color profile present */
  hasProfile: boolean;
  /** Alpha transparency channel present */
  hasAlpha: boolean;
  /** Number of pages/frames (multi-page formats) */
  pages?: number;
  /** Height of each page in a multi-page image */
  pageHeight?: number;
  /** Animation loop count (0 = infinite) */
  loop?: number;
  /** Frame delays in milliseconds */
  delay?: number[];
  /** Primary page number (HEIF) */
  pagePrimary?: number;
  /** Default background color */
  background?: { r: number; g: number; b: number } | { gray: number };
  /** Multi-level image details (OpenSlide) */
  levels?: LevelMetadata[];
  /** Sub Image File Directories count (OME-TIFF) */
  subifds?: number;
  /** Resolution unit */
  resolutionUnit?: 'inch' | 'cm';
  /** ImageMagick format identifier */
  formatMagick?: string;
  /** HEIF compression format */
  compression?: 'av1' | 'hevc';
  /** PNG text comments */
  comments?: CommentsMetadata[];
  /** Raw metadata buffers */
  exif?: Buffer;
  icc?: Buffer;
  iptc?: Buffer;
  xmp?: Buffer;
  xmpAsString?: string;
  tifftagPhotoshop?: Buffer;
}

interface LevelMetadata {
  width: number;
  height: number;
}

interface CommentsMetadata {
  keyword: string;
  text: string;
}

```
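The `autoOrient` dimensions follow directly from `orientation`: EXIF values 5–8 involve a 90° transpose, so width and height swap. Sharp computes `autoOrient` for you; the sketch below (the function name is illustrative, not part of the API) just shows the mapping:

```javascript
// Derive auto-oriented dimensions from raw dimensions and EXIF orientation.
// Orientations 5-8 involve a 90-degree transpose, swapping width and height.
// Illustrative only: sharp already reports this as metadata.autoOrient.
const autoOrientedSize = (width, height, orientation = 1) =>
  orientation >= 5 && orientation <= 8
    ? { width: height, height: width }
    : { width, height };

console.log(autoOrientedSize(4000, 3000, 6)); // { width: 3000, height: 4000 }
console.log(autoOrientedSize(4000, 3000, 3)); // { width: 4000, height: 3000 }
```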

**Usage Examples:**

```javascript
const sharp = require('sharp');

// Basic metadata inspection
const metadata = await sharp('image.jpg').metadata();
console.log(`Format: ${metadata.format}`);
console.log(`Dimensions: ${metadata.width}x${metadata.height}`);
console.log(`Channels: ${metadata.channels}`);
console.log(`Has Alpha: ${metadata.hasAlpha}`);
console.log(`Color Space: ${metadata.space}`);

// Check for specific features
if (metadata.isProgressive) {
  console.log('Image uses progressive encoding');
}

if (metadata.orientation && metadata.orientation !== 1) {
  console.log(`EXIF orientation: ${metadata.orientation}`);
  console.log(`Auto-oriented size: ${metadata.autoOrient.width}x${metadata.autoOrient.height}`);
}

// Multi-page image handling
if (metadata.pages && metadata.pages > 1) {
  console.log(`Multi-page image with ${metadata.pages} pages`);
  console.log(`Page height: ${metadata.pageHeight}`);
}

// Animation information
if (metadata.delay) {
  console.log(`Animated with delays: ${metadata.delay.join(', ')}ms`);
  console.log(`Loop count: ${metadata.loop || 'infinite'}`);
}

```

### Pixel Statistics

Analyze pixel-level statistics for each channel without full image processing.

```javascript { .api }

/**
 * Access pixel-derived statistics for every channel
 * @returns Promise resolving to statistics object
 */
stats(): Promise<Stats>;

interface Stats {
  /** Per-channel statistics array */
  channels: ChannelStats[];
  /** True when the image is fully opaque */
  isOpaque: boolean;
  /** Histogram-based entropy (experimental) */
  entropy: number;
  /** Laplacian-based sharpness estimation (experimental) */
  sharpness: number;
  /** Dominant color in sRGB space (experimental) */
  dominant: { r: number; g: number; b: number };
}

interface ChannelStats {
  /** Minimum pixel value in channel */
  min: number;
  /** Maximum pixel value in channel */
  max: number;
  /** Sum of all pixel values */
  sum: number;
  /** Sum of squared pixel values */
  squaresSum: number;
  /** Mean pixel value */
  mean: number;
  /** Standard deviation */
  stdev: number;
  /** X coordinate of minimum pixel */
  minX: number;
  /** Y coordinate of minimum pixel */
  minY: number;
  /** X coordinate of maximum pixel */
  maxX: number;
  /** Y coordinate of maximum pixel */
  maxY: number;
}

```
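`mean` and `stdev` are derivable from the raw `sum` and `squaresSum` fields: for N pixels, mean = sum / N and stdev = sqrt((squaresSum − sum² / N) / (N − 1)). The (N − 1) sample divisor is an assumption here; verify it against the `stdev` your sharp version reports. A quick sanity check:

```javascript
// Recompute mean and standard deviation from the raw sums that
// ChannelStats exposes. The (N - 1) sample divisor is an assumption;
// check it against the stdev your sharp version reports.
const deriveStats = (sum, squaresSum, n) => ({
  mean: sum / n,
  stdev: Math.sqrt((squaresSum - (sum * sum) / n) / (n - 1))
});

// Three pixels with values 10, 20, 30: sum = 60, squaresSum = 1400
console.log(deriveStats(60, 1400, 3)); // { mean: 20, stdev: 10 }
```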

**Usage Examples:**

```javascript
// Comprehensive statistics analysis
const stats = await sharp('photo.jpg').stats();

console.log(`Image is ${stats.isOpaque ? 'opaque' : 'transparent'}`);
console.log(`Entropy: ${stats.entropy.toFixed(2)}`);
console.log(`Sharpness: ${stats.sharpness.toFixed(2)}`);
console.log(`Dominant color: RGB(${stats.dominant.r}, ${stats.dominant.g}, ${stats.dominant.b})`);

// Per-channel analysis
stats.channels.forEach((channel, index) => {
  const channelName = ['Red', 'Green', 'Blue', 'Alpha'][index] || `Channel ${index}`;
  console.log(`${channelName} channel:`);
  console.log(`  Range: ${channel.min} - ${channel.max}`);
  console.log(`  Mean: ${channel.mean.toFixed(2)}`);
  console.log(`  Std Dev: ${channel.stdev.toFixed(2)}`);
  console.log(`  Min at: (${channel.minX}, ${channel.minY})`);
  console.log(`  Max at: (${channel.maxX}, ${channel.maxY})`);
});

// Quality assessment
const brightness = stats.channels.slice(0, 3).reduce((sum, ch) => sum + ch.mean, 0) / 3;
const contrast = stats.channels.slice(0, 3).reduce((sum, ch) => sum + ch.stdev, 0) / 3;

console.log(`Average brightness: ${brightness.toFixed(1)}`);
console.log(`Average contrast: ${contrast.toFixed(1)}`);

if (stats.sharpness < 50) {
  console.log('Image appears to be blurred');
} else if (stats.sharpness > 200) {
  console.log('Image appears to be very sharp');
}

```

### Metadata Processing Workflows

**Batch Metadata Extraction:**

```javascript

const fs = require('fs/promises');
const path = require('path');
const sharp = require('sharp');

const analyzeDirectory = async (directory) => {
  const files = await fs.readdir(directory);
  const imageFiles = files.filter(f => /\.(jpg|jpeg|png|webp|tiff)$/i.test(f));

  const results = await Promise.all(
    imageFiles.map(async (file) => {
      const filePath = path.join(directory, file);
      try {
        const [metadata, stats] = await Promise.all([
          sharp(filePath).metadata(),
          sharp(filePath).stats()
        ]);

        return {
          file,
          format: metadata.format,
          dimensions: `${metadata.width}x${metadata.height}`,
          size: metadata.size, // undefined for file input (Buffer/Stream only)
          channels: metadata.channels,
          hasAlpha: metadata.hasAlpha,
          colorSpace: metadata.space,
          sharpness: stats.sharpness,
          brightness: stats.channels.slice(0, 3).reduce((sum, ch) => sum + ch.mean, 0) / 3
        };
      } catch (error) {
        return { file, error: error.message };
      }
    })
  );

  return results;
};

```

**Quality Assessment:**

```javascript

const assessImageQuality = async (imagePath) => {
  const [metadata, stats] = await Promise.all([
    sharp(imagePath).metadata(),
    sharp(imagePath).stats()
  ]);

  const assessment = {
    file: imagePath,
    format: metadata.format,
    dimensions: { width: metadata.width, height: metadata.height },
    quality: {
      sharpness: stats.sharpness,
      entropy: stats.entropy,
      isProgressive: metadata.isProgressive,
      chromaSubsampling: metadata.chromaSubsampling
    },
    issues: []
  };

  // Quality checks
  if (stats.sharpness < 30) {
    assessment.issues.push('Image appears blurred');
  }

  if (stats.entropy < 6) {
    assessment.issues.push('Low image complexity');
  }

  if (metadata.width < 300 || metadata.height < 300) {
    assessment.issues.push('Low resolution');
  }

  // Channel analysis for exposure issues
  const rgbChannels = stats.channels.slice(0, 3);
  const avgBrightness = rgbChannels.reduce((sum, ch) => sum + ch.mean, 0) / 3;

  if (avgBrightness < 50) {
    assessment.issues.push('Underexposed');
  } else if (avgBrightness > 200) {
    assessment.issues.push('Overexposed');
  }

  // Check for clipping (assumes 8-bit channels, range 0-255)
  const hasClipping = rgbChannels.some(ch => ch.min === 0 || ch.max === 255);
  if (hasClipping) {
    assessment.issues.push('Clipped highlights/shadows');
  }

  return assessment;
};

```

**Smart Processing Based on Metadata:**

```javascript

const smartProcess = async (input, output) => {
  const metadata = await sharp(input).metadata();

  let pipeline = sharp(input);

  // Auto-orient if needed
  if (metadata.orientation && metadata.orientation !== 1) {
    pipeline = pipeline.autoOrient();
  }

  // Resize large images for web
  if (metadata.width > 2000 || metadata.height > 2000) {
    pipeline = pipeline.resize({
      width: 1920,
      height: 1920,
      fit: 'inside',
      withoutEnlargement: true
    });
  }

  // Choose output format based on content
  if (metadata.hasAlpha) {
    // Preserve transparency
    pipeline = pipeline.png({ quality: 90 });
  } else if (metadata.format === 'jpeg' || metadata.channels === 3) {
    // Photographic content
    pipeline = pipeline.jpeg({ quality: 85, progressive: true });
  } else {
    // Graphics/other content
    pipeline = pipeline.webp({ quality: 90 });
  }

  await pipeline.toFile(output);
};

```
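The format-selection branch above can be pulled out into a pure function, which makes the policy trivial to unit-test without touching any image files. A sketch (the function name and the policy itself are illustrative, not part of sharp's API):

```javascript
// Pure distillation of the format-selection policy: transparency keeps
// PNG, photographic content gets JPEG, everything else gets WebP.
// Name and shape are illustrative, not part of sharp's API.
const chooseFormat = ({ hasAlpha, format, channels }) => {
  if (hasAlpha) return 'png';
  if (format === 'jpeg' || channels === 3) return 'jpeg';
  return 'webp';
};

console.log(chooseFormat({ hasAlpha: true, format: 'png', channels: 4 }));   // png
console.log(chooseFormat({ hasAlpha: false, format: 'jpeg', channels: 3 })); // jpeg
console.log(chooseFormat({ hasAlpha: false, format: 'gif', channels: 1 }));  // webp
```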

**Metadata-Based Validation:**

```javascript

const validateImage = async (imagePath, requirements = {}) => {
  const metadata = await sharp(imagePath).metadata();

  const validation = {
    valid: true,
    errors: [],
    warnings: []
  };

  // Format validation
  if (requirements.formats && !requirements.formats.includes(metadata.format)) {
    validation.valid = false;
    validation.errors.push(`Invalid format: ${metadata.format}. Expected: ${requirements.formats.join(', ')}`);
  }

  // Dimension validation
  if (requirements.minWidth && metadata.width < requirements.minWidth) {
    validation.valid = false;
    validation.errors.push(`Width too small: ${metadata.width}px < ${requirements.minWidth}px`);
  }

  if (requirements.maxWidth && metadata.width > requirements.maxWidth) {
    validation.warnings.push(`Width large: ${metadata.width}px > ${requirements.maxWidth}px`);
  }

  if (requirements.minHeight && metadata.height < requirements.minHeight) {
    validation.valid = false;
    validation.errors.push(`Height too small: ${metadata.height}px < ${requirements.minHeight}px`);
  }

  // Aspect ratio validation
  if (requirements.aspectRatio) {
    const ratio = metadata.width / metadata.height;
    const expected = requirements.aspectRatio;
    const tolerance = requirements.aspectTolerance || 0.1;

    if (Math.abs(ratio - expected) > tolerance) {
      validation.warnings.push(`Aspect ratio ${ratio.toFixed(2)} differs from expected ${expected}`);
    }
  }

  // Color space validation
  if (requirements.colorSpace && metadata.space !== requirements.colorSpace) {
    validation.warnings.push(`Color space ${metadata.space} differs from expected ${requirements.colorSpace}`);
  }

  // Profile validation
  if (requirements.requireProfile && !metadata.hasProfile) {
    validation.warnings.push('No embedded color profile found');
  }

  return validation;
};

// Usage
const result = await validateImage('upload.jpg', {
  formats: ['jpeg', 'png', 'webp'],
  minWidth: 800,
  minHeight: 600,
  maxWidth: 4000,
  aspectRatio: 16 / 9,
  aspectTolerance: 0.2,
  colorSpace: 'srgb',
  requireProfile: true
});

```

## Advanced Metadata Features

**EXIF Data Extraction:**

```javascript

const extractExifData = async (imagePath) => {
  const metadata = await sharp(imagePath).metadata();

  if (metadata.exif) {
    // EXIF data is available as a raw Buffer;
    // use an EXIF parsing library (e.g. exif-reader) to decode it
    console.log(`EXIF data size: ${metadata.exif.length} bytes`);
  }

  if (metadata.orientation) {
    const orientations = {
      1: 'Normal',
      2: 'Flipped horizontally',
      3: 'Rotated 180°',
      4: 'Flipped vertically',
      5: 'Rotated 90° CCW and flipped',
      6: 'Rotated 90° CW',
      7: 'Rotated 90° CW and flipped',
      8: 'Rotated 90° CCW'
    };

    console.log(`Orientation: ${orientations[metadata.orientation] || 'Unknown'}`);
  }

  return metadata;
};

```

**Performance Considerations:**

- Metadata access is very fast as it doesn't decode pixel data
- Statistics calculation requires pixel processing but is optimized
- Use metadata for quick validation before expensive operations
- Cache metadata results for repeated access to the same image
- Consider using metadata for smart batch processing decisions
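One way to cache metadata per file path is to memoise the lookup promise, so concurrent and repeated queries share a single read. A minimal sketch; `createMetadataCache` is an illustrative helper, and the stub loader stands in for `(p) => sharp(p).metadata()`:

```javascript
// Memoise metadata lookups per file path so repeated queries reuse the
// first promise. `loader` stands in for `(p) => sharp(p).metadata()`.
const createMetadataCache = (loader) => {
  const cache = new Map();
  return (filePath) => {
    if (!cache.has(filePath)) cache.set(filePath, loader(filePath));
    return cache.get(filePath);
  };
};

// Stub loader counting real lookups (replace with sharp in practice).
let lookups = 0;
const getMeta = createMetadataCache(async (p) => {
  lookups += 1;
  return { path: p, format: 'jpeg' };
});

getMeta('a.jpg');
getMeta('a.jpg'); // second call reuses the cached promise
console.log(lookups); // 1
```

Because the promise itself is cached, callers racing on the same path never trigger duplicate reads; invalidate the entry if the file changes on disk.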