
Reddit Pixel Tracking for Shopify: A Server-Side Implementation with the Conversions API

Reddit's e-commerce tracking faces the same challenge as every other ad platform: browser-based pixels are increasingly blocked by ad blockers, privacy settings, and iOS tracking restrictions. According to Reddit's official documentation, the Conversions API offers tracking that is "more resilient" and "less susceptible to ad blockers and browser restrictions."

This guide shows you how to implement server-side Reddit conversion tracking for a Shopify store using the Reddit Conversions API. You will end up with a production-ready, Docker-based solution that sends conversion data to Reddit directly from your server, bypassing all client-side limitations.

Reddit officially recommends running the Pixel and the Conversions API together. Quoting its documentation: "For best results, we recommend integrating both the Reddit Pixel and the Conversions API... CAPI can help you get more accurate data and improve conversion coverage."


Table of Contents

  1. Why You Need Server-Side Tracking
  2. Architecture Overview
  3. Prerequisites
  4. Step 1: Reddit Ads Manager Setup
  5. Step 2: Building the Conversion Tracker
  6. Step 3: Docker Configuration
  7. Step 4: Nginx Reverse Proxy
  8. Step 5: SSL Certificates
  9. Step 6: Shopify Webhook Configuration
  10. Step 7: Testing and Validation
  11. Troubleshooting
  12. Monitoring and Maintenance

Why You Need Server-Side Tracking

The Limits of Client-Side Pixels

The Reddit Pixel is a JavaScript snippet that runs in the user's browser. It works for basic tracking, but in 2025 it faces significant limitations:

  • Ad blockers: browser extensions block pixel requests entirely
  • iOS App Tracking Transparency: requires explicit user consent
  • Third-party cookie blocking: Safari and Firefox block by default
  • Privacy settings: modern browsers restrict tracking capabilities

These are not theoretical problems. According to industry data, relying on client-side tracking alone can lose 30-50% of conversion events.

Reddit's Official Recommendation

From Reddit's Conversions API documentation:

"CAPI is more resilient to signal loss because it operates server-side and is less susceptible to ad blockers and browser restrictions. This improves measurement, targeting, and optimization."

The Conversions API sends data server-to-server. With no browser involved, there is nothing to block, no consent prompt in the way, and the data arrives complete and accurate.

Why You Want Both

Reddit recommends using the Pixel and CAPI together:

  • Pixel: captures client-side interactions, click IDs, and browser context
  • CAPI: ensures conversion data reaches Reddit even when the Pixel is blocked
  • Deduplication: when both send the same event_id, Reddit automatically handles the duplicate events

This guide implements the CAPI component. You can add the Pixel separately via Shopify's customer events.


Architecture Overview

How It Works

The implementation uses Shopify webhooks to capture order events server-side, then forwards them to Reddit's Conversions API:

Shopify Store (order created)
    |
    v
Shopify Webhook (HTTPS POST)
    |
    v
Nginx Reverse Proxy (SSL termination)
    |
    v
Node.js Application (Docker container)
    |
    v
1. HMAC Verification (security)
2. Event Transformation (Shopify -> Reddit format)
3. PII Hashing (SHA-256)
4. Deduplication Check (PostgreSQL)
5. Send to Reddit CAPI
6. Log Result
    |
    v
Reddit Ads Platform (conversion attributed)

Components

  • Webhook receiver: accepts Shopify order events (Node.js/Express)
  • HMAC verification: validates webhook authenticity (crypto, SHA-256)
  • Event transformer: converts Shopify data into Reddit's format (custom service)
  • Deduplication: prevents duplicate events (PostgreSQL)
  • Reddit API client: sends events to Reddit (HTTP client with retry logic)
  • Reverse proxy: SSL termination and routing (Nginx)

Prerequisites

You will need:

  • Server: a Linux VPS with Docker installed (2 GB RAM minimum)
  • Domain: a subdomain pointed at the server (e.g. shopify-events.yourdomain.com)
  • Reddit Ads account: an active account with a Pixel already created
  • Shopify store: admin access to configure webhooks
  • Tools: SSH access and basic command-line knowledge

Step 1: Reddit Ads Manager Setup

Create a Pixel

  1. Go to Reddit Ads Manager → Events Manager
  2. Click "Create New Pixel"
  3. Name the pixel (e.g. "Shopify Store Conversions")
  4. Copy the Pixel ID (format: t2_abc123)

Generate a Conversions API Access Token

  1. In Events Manager, click your pixel
  2. Go to "Settings" → "Conversions API"
  3. Click "Generate Access Token"
  4. Copy the token (used as Bearer ey...)
  5. Important: store this token securely. You will not be able to view it again.

You now have:

  • REDDIT_PIXEL_ID (e.g. t2_abc123)
  • REDDIT_ACCESS_TOKEN (e.g. eyJhbGciOiJSUzI1NiIsImtpZCI...)

Step 2: Building the Conversion Tracker

Create the project directory structure:

mkdir -p /opt/reddit-capi/src/{routes,middleware,services,utils}
cd /opt/reddit-capi

package.json

{
  "name": "reddit-capi",
  "version": "1.0.0",
  "type": "module",
  "scripts": {
    "start": "node src/index.js"
  },
  "dependencies": {
    "express": "^4.18.2",
    "pg": "^8.11.3",
    "winston": "^3.11.0",
    "uuid": "^9.0.1"
  }
}

src/config.js

export const config = {
  port: process.env.PORT || 3000,

  shopify: {
    webhookSecret: process.env.SHOPIFY_WEBHOOK_SECRET,
  },

  reddit: {
    pixelId: process.env.REDDIT_PIXEL_ID,
    accessToken: process.env.REDDIT_ACCESS_TOKEN,
    apiUrl: 'https://ads-api.reddit.com/api/v2.0/conversions/events',
  },

  database: {
    host: process.env.DB_HOST || 'postgres',
    port: process.env.DB_PORT || 5432,
    database: process.env.DB_NAME || 'reddit_capi',
    user: process.env.DB_USER || 'reddit_capi',
    password: process.env.DB_PASSWORD,
  },
};

src/utils/logger.js

import winston from 'winston';

export const logger = winston.createLogger({
  level: 'info',
  format: winston.format.combine(
    winston.format.timestamp(),
    winston.format.json()
  ),
  transports: [
    new winston.transports.Console({
      format: winston.format.combine(
        winston.format.colorize(),
        winston.format.simple()
      ),
    }),
  ],
});

src/middleware/shopifyVerify.js

This middleware handles both HMAC verification and body parsing:

import crypto from 'crypto';
import { config } from '../config.js';
import { logger } from '../utils/logger.js';

// Read raw body for HMAC verification and parse JSON
export function rawBodyCapture(req, res, next) {
  let data = '';

  req.on('data', (chunk) => {
    data += chunk;
  });

  req.on('end', () => {
    // Store raw body for HMAC verification
    req.rawBody = data;

    // Parse JSON and set as req.body
    // Important: Don't use express.json() middleware - it consumes the stream
    try {
      req.body = JSON.parse(data);
    } catch (error) {
      logger.error('Failed to parse JSON body:', error);
      return res.status(400).json({ error: 'Invalid JSON' });
    }

    next();
  });
}

export function verifyShopifyWebhook(req, res, next) {
  const hmacHeader = req.get('X-Shopify-Hmac-SHA256');

  if (!hmacHeader) {
    logger.warn('Missing HMAC header');
    return res.status(401).json({ error: 'Unauthorized - Missing HMAC' });
  }

  // Calculate HMAC using raw body
  const hash = crypto
    .createHmac('sha256', config.shopify.webhookSecret)
    .update(req.rawBody)
    .digest('base64');

  // Constant-time comparison to avoid leaking timing information
  const expected = Buffer.from(hash, 'base64');
  const received = Buffer.from(hmacHeader, 'base64');

  if (expected.length !== received.length || !crypto.timingSafeEqual(expected, received)) {
    logger.warn('HMAC verification failed');
    return res.status(401).json({ error: 'Unauthorized - Invalid HMAC' });
  }

  logger.info('Webhook HMAC verified', {
    topic: req.get('X-Shopify-Topic'),
  });

  next();
}

src/services/crypto.js

import crypto from 'crypto';

export function hashEmail(email) {
  if (!email) return null;

  // Normalize: lowercase and trim
  const normalized = email.toLowerCase().trim();

  // SHA-256 hash
  return crypto.createHash('sha256').update(normalized).digest('hex');
}

export function hashPhone(phone) {
  if (!phone) return null;

  // Remove all non-digits
  const normalized = phone.replace(/\D/g, '');

  return crypto.createHash('sha256').update(normalized).digest('hex');
}

src/services/shopifyTransformer.js

Transforms Shopify order data into Reddit CAPI format:

import { v4 as uuidv4 } from 'uuid';
import { hashEmail, hashPhone } from './crypto.js';

export function transformOrderToPurchase(order) {
  // Deterministic ID: Shopify webhook redeliveries produce the same event_id,
  // so the deduplication check can catch them
  const eventId = `shopify_order_${order.id}`;

  const redditEvent = {
    event_at: order.created_at,
    event_type: {
      tracking_type: 'Purchase',
    },
    user: {
      // Hash PII for privacy
      email: hashEmail(order.email),
      ...(order.phone && { phone_number: hashPhone(order.phone) }),

      // IP and User Agent if available
      ...(order.browser_ip && { ip_address: order.browser_ip }),
      ...(order.client_details?.user_agent && {
        user_agent: order.client_details.user_agent
      }),
    },
    event_metadata: {
      // Reddit requires value as integer in minor currency units (cents)
      value: Math.round((parseFloat(order.total_price) || 0) * 100),

      // Unique ID for deduplication
      conversion_id: String(order.id),

      currency: order.currency,

      // Optional: Product details
      item_count: order.line_items?.length || 0,

      // Custom data for tracking
      order_number: order.order_number,
    },
  };

  return {
    event_id: eventId,
    event_type: 'Purchase',
    shopify_id: String(order.id),
    reddit_payload: redditEvent,
  };
}

export function transformCheckoutToLead(checkout) {
  // Deterministic ID so webhook redeliveries deduplicate correctly
  const eventId = `shopify_checkout_${checkout.id}`;

  const redditEvent = {
    event_at: checkout.created_at,
    event_type: {
      tracking_type: 'Lead',
    },
    user: {
      email: hashEmail(checkout.email),
      ...(checkout.phone && { phone_number: hashPhone(checkout.phone) }),
    },
    event_metadata: {
      value: Math.round((parseFloat(checkout.total_price) || 0) * 100),
      conversion_id: String(checkout.id),
      currency: checkout.currency,
    },
  };

  return {
    event_id: eventId,
    event_type: 'Lead',
    shopify_id: String(checkout.id),
    reddit_payload: redditEvent,
  };
}

Key detail: Reddit's API requires value to be an integer representing the amount in minor currency units (cents for USD, pence for GBP, and so on). That is why we multiply by 100 and round.
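As a quick sanity check, here is the same conversion applied on its own to a few sample totals (decimal strings, as Shopify sends them). The helper name toMinorUnits is ours, not part of the files above:

```javascript
// Convert a Shopify decimal-string total into integer minor units (cents).
// Math.round guards against floating-point drift (e.g. 0.10 * 100 = 10.000000000000002).
function toMinorUnits(totalPrice) {
  return Math.round((parseFloat(totalPrice) || 0) * 100);
}

console.log(toMinorUnits('49.99'));  // 4999
console.log(toMinorUnits('0.10'));   // 10
console.log(toMinorUnits(null));     // 0 (missing totals fall back to 0)
```

Sending 49.99 instead of 4999 is exactly what triggers the "unexpected type number" error covered in the troubleshooting section.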

src/services/deduplication.js

import pg from 'pg';
import { config } from '../config.js';
import { logger } from '../utils/logger.js';

const { Pool } = pg;

const pool = new Pool(config.database);

export async function isDuplicate(eventId) {
  const result = await pool.query(
    'SELECT event_id FROM events WHERE event_id = $1',
    [eventId]
  );

  return result.rows.length > 0;
}

export async function saveEvent(eventData) {
  const {
    event_id,
    event_type,
    shopify_id,
    shopify_payload,
    reddit_payload,
  } = eventData;

  await pool.query(
    `INSERT INTO events (event_id, event_type, shopify_id, shopify_payload, reddit_payload, status)
     VALUES ($1, $2, $3, $4, $5, 'pending')`,
    [
      event_id,
      event_type,
      shopify_id,
      JSON.stringify(shopify_payload),
      JSON.stringify(reddit_payload),
    ]
  );

  logger.info('Event saved to database', { event_id, event_type });
}

export async function markEventSent(eventId, response) {
  await pool.query(
    `UPDATE events
     SET status = 'sent', sent_at = NOW(), reddit_response = $2
     WHERE event_id = $1`,
    [eventId, JSON.stringify(response)]
  );
}

export async function markEventFailed(eventId, error) {
  await pool.query(
    `UPDATE events
     SET status = 'failed', error_message = $2
     WHERE event_id = $1`,
    [eventId, error.message]
  );
}

src/services/redditCapi.js

import { config } from '../config.js';
import { logger } from '../utils/logger.js';

export async function sendToReddit(eventData) {
  const payload = {
    events: [eventData],
  };

  logger.info('Sending event to Reddit CAPI', {
    tracking_type: eventData.event_type.tracking_type,
    conversion_id: eventData.event_metadata.conversion_id,
  });

  // Retry logic: 3 attempts with exponential backoff
  let lastError;

  for (let attempt = 1; attempt <= 3; attempt++) {
    try {
      // Reddit's CAPI endpoint is scoped to the pixel ID
      const response = await fetch(`${config.reddit.apiUrl}/${config.reddit.pixelId}`, {
        method: 'POST',
        headers: {
          'Authorization': `Bearer ${config.reddit.accessToken}`,
          'Content-Type': 'application/json',
        },
        body: JSON.stringify(payload),
      });

      const responseText = await response.text();

      if (response.ok) {
        logger.info('Reddit CAPI success', {
          status: response.status,
          attempt,
        });

        return {
          success: true,
          status: response.status,
          response: responseText,
        };
      }

      // Handle rate limiting
      if (response.status === 429) {
        const retryAfter = parseInt(response.headers.get('Retry-After') || '60');
        logger.warn(`Rate limited by Reddit, retry after ${retryAfter}s`);

        await sleep(retryAfter * 1000);
        continue;
      }

      // Log error but retry
      logger.error('Reddit API error', {
        status: response.status,
        response: responseText,
        attempt,
      });

      lastError = new Error(`Reddit API error: ${response.status} - ${responseText}`);

      // Exponential backoff: 1s, 2s, 4s
      if (attempt < 3) {
        await sleep(Math.pow(2, attempt - 1) * 1000);
      }

    } catch (error) {
      logger.error('Network error sending to Reddit', {
        error: error.message,
        attempt,
      });

      lastError = error;

      if (attempt < 3) {
        await sleep(Math.pow(2, attempt - 1) * 1000);
      }
    }
  }

  throw lastError;
}

function sleep(ms) {
  return new Promise(resolve => setTimeout(resolve, ms));
}

src/routes/webhooks.js

import express from 'express';
import { logger } from '../utils/logger.js';
import * as shopifyTransformer from '../services/shopifyTransformer.js';
import * as deduplication from '../services/deduplication.js';
import * as redditCapi from '../services/redditCapi.js';

const router = express.Router();

router.post('/webhooks/shopify', async (req, res) => {
  const topic = req.get('X-Shopify-Topic');
  const shopDomain = req.get('X-Shopify-Shop-Domain');

  logger.info('Received Shopify webhook', { topic, shopDomain });

  try {
    let eventData;

    // Route based on webhook topic
    switch (topic) {
      case 'orders/create':
        eventData = shopifyTransformer.transformOrderToPurchase(req.body);
        break;

      case 'checkouts/create':
        eventData = shopifyTransformer.transformCheckoutToLead(req.body);
        break;

      default:
        logger.info('Unsupported webhook topic', { topic });
        return res.status(200).json({
          status: 'ignored',
          message: `Topic ${topic} not supported`,
        });
    }

    // Check for duplicates
    if (await deduplication.isDuplicate(eventData.event_id)) {
      logger.info('Duplicate event detected', { event_id: eventData.event_id });
      return res.status(200).json({ status: 'duplicate' });
    }

    // Save to database
    await deduplication.saveEvent({
      ...eventData,
      shopify_payload: req.body,
    });

    // Send to Reddit CAPI
    try {
      const result = await redditCapi.sendToReddit(eventData.reddit_payload);

      await deduplication.markEventSent(eventData.event_id, result);

      return res.status(200).json({
        status: 'success',
        event_id: eventData.event_id,
      });

    } catch (error) {
      await deduplication.markEventFailed(eventData.event_id, error);

      // Still return 200 to Shopify so it doesn't retry
      return res.status(200).json({
        status: 'failed',
        event_id: eventData.event_id,
        error: error.message,
      });
    }

  } catch (error) {
    logger.error('Error processing webhook', { error: error.message });

    return res.status(500).json({
      error: 'Internal server error',
      message: error.message,
    });
  }
});

export default router;

src/routes/health.js

import express from 'express';
import pg from 'pg';
import { config } from '../config.js';

const router = express.Router();
const { Pool } = pg;
const pool = new Pool(config.database);

router.get('/health', async (req, res) => {
  try {
    // Check database connection
    await pool.query('SELECT 1');

    res.json({
      status: 'healthy',
      timestamp: new Date().toISOString(),
      uptime: process.uptime(),
      database: 'connected',
    });
  } catch (error) {
    res.status(503).json({
      status: 'unhealthy',
      error: error.message,
    });
  }
});

export default router;

src/index.js

import express from 'express';
import { config } from './config.js';
import { logger } from './utils/logger.js';
import { rawBodyCapture, verifyShopifyWebhook } from './middleware/shopifyVerify.js';
import healthRouter from './routes/health.js';
import webhooksRouter from './routes/webhooks.js';

const app = express();

// Health check (no authentication)
app.use('/', healthRouter);

// Shopify webhooks (with HMAC verification)
// Important: rawBodyCapture must come before verifyShopifyWebhook
app.use('/webhooks/shopify', rawBodyCapture);
app.use('/webhooks/shopify', verifyShopifyWebhook);
app.use('/', webhooksRouter);

const server = app.listen(config.port, () => {
  logger.info(`Server running on port ${config.port}`);
});

// Graceful shutdown
process.on('SIGTERM', () => {
  logger.info('SIGTERM received, closing server');
  server.close(() => {
    logger.info('Server closed');
    process.exit(0);
  });
});

Step 3: Docker Configuration

Dockerfile

FROM node:18-alpine

WORKDIR /app

# Install dependencies
COPY package*.json ./
RUN npm ci --only=production

# Copy application code
COPY src ./src

# Run as non-root user
USER node

EXPOSE 3000

CMD ["node", "src/index.js"]

docker-compose.yml

version: '3.8'

services:
  reddit-capi:
    build: .
    ports:
      # Bind only to localhost - not publicly accessible
      - "127.0.0.1:3000:3000"
    environment:
      - PORT=3000
      - SHOPIFY_WEBHOOK_SECRET=${SHOPIFY_WEBHOOK_SECRET}
      - REDDIT_PIXEL_ID=${REDDIT_PIXEL_ID}
      - REDDIT_ACCESS_TOKEN=${REDDIT_ACCESS_TOKEN}
      - DB_HOST=postgres
      - DB_PORT=5432
      - DB_NAME=reddit_capi
      - DB_USER=reddit_capi
      - DB_PASSWORD=${DB_PASSWORD}
    depends_on:
      - postgres
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "wget", "--quiet", "--tries=1", "--spider", "http://localhost:3000/health"]
      interval: 30s
      timeout: 10s
      retries: 3

  postgres:
    image: postgres:16-alpine
    environment:
      - POSTGRES_DB=reddit_capi
      - POSTGRES_USER=reddit_capi
      - POSTGRES_PASSWORD=${DB_PASSWORD}
    volumes:
      - pgdata:/var/lib/postgresql/data
      - ./init.sql:/docker-entrypoint-initdb.d/init.sql
    restart: unless-stopped

volumes:
  pgdata:

init.sql

CREATE TABLE IF NOT EXISTS events (
  event_id VARCHAR(255) PRIMARY KEY,
  event_type VARCHAR(50) NOT NULL,
  shopify_id VARCHAR(255) NOT NULL,
  shopify_payload JSONB,
  reddit_payload JSONB,
  reddit_response JSONB,
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
  sent_at TIMESTAMP,
  status VARCHAR(20) DEFAULT 'pending',
  error_message TEXT
);

CREATE INDEX idx_events_status ON events(status);
CREATE INDEX idx_events_created_at ON events(created_at);
CREATE INDEX idx_events_shopify_id ON events(shopify_id);

.env

# Shopify Configuration
SHOPIFY_WEBHOOK_SECRET=your_shopify_webhook_secret_here

# Reddit Configuration
REDDIT_PIXEL_ID=t2_your_pixel_id
REDDIT_ACCESS_TOKEN=your_reddit_access_token_here

# Database Configuration
DB_PASSWORD=generate_strong_password_here

Security note: set restrictive permissions on the .env file:

chmod 600 .env
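A misconfigured .env tends to fail silently (for example, HMAC verification rejects everything when SHOPIFY_WEBHOOK_SECRET is unset). An optional startup guard in src/index.js can fail fast instead. This is a sketch, not part of the files above; the helper name missingEnvVars is ours:

```javascript
// Optional startup guard: list required env vars that are missing, so
// src/index.js can exit with a clear error instead of running broken.
function missingEnvVars(env, required) {
  return required.filter((name) => !env[name]);
}

// Demonstration with a stand-in env object:
const missing = missingEnvVars(
  { SHOPIFY_WEBHOOK_SECRET: 'set', REDDIT_PIXEL_ID: 'set' },
  ['SHOPIFY_WEBHOOK_SECRET', 'REDDIT_PIXEL_ID', 'REDDIT_ACCESS_TOKEN', 'DB_PASSWORD']
);
console.log(missing); // ['REDDIT_ACCESS_TOKEN', 'DB_PASSWORD']
```

In src/index.js you would call missingEnvVars(process.env, required) before app.listen and process.exit(1) with an error message if the returned array is non-empty.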

Step 4: Nginx Reverse Proxy

Create the Nginx configuration file:

# /etc/nginx/sites-available/shopify-events.yourdomain.com.conf

server {
    listen 80;
    server_name shopify-events.yourdomain.com;

    # Let's Encrypt ACME challenge
    location /.well-known/acme-challenge/ {
        root /var/www/html;
    }

    # Redirect all other HTTP traffic to HTTPS
    location / {
        return 301 https://$host$request_uri;
    }
}

Enable the site:

ln -s /etc/nginx/sites-available/shopify-events.yourdomain.com.conf /etc/nginx/sites-enabled/
nginx -t
systemctl reload nginx

Step 5: SSL Certificates

Install Certbot if it is not already installed:

apt update
apt install certbot

Create the webroot directory:

mkdir -p /var/www/html/.well-known/acme-challenge

Generate the certificate:

certbot certonly --webroot \
  -w /var/www/html \
  -d shopify-events.yourdomain.com

Update the Nginx configuration to use SSL:

# Add this server block to /etc/nginx/sites-available/shopify-events.yourdomain.com.conf

server {
    listen 443 ssl http2;
    server_name shopify-events.yourdomain.com;

    ssl_certificate /etc/letsencrypt/live/shopify-events.yourdomain.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/shopify-events.yourdomain.com/privkey.pem;

    ssl_protocols TLSv1.2 TLSv1.3;
    ssl_ciphers HIGH:!aNULL:!MD5;
    ssl_prefer_server_ciphers on;
    ssl_session_cache shared:SSL:10m;
    ssl_session_timeout 10m;

    location / {
        proxy_pass http://127.0.0.1:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto https;

        proxy_http_version 1.1;
        proxy_set_header Connection "";

        # Timeouts for webhook processing
        proxy_connect_timeout 60s;
        proxy_send_timeout 60s;
        proxy_read_timeout 60s;
    }
}

Test and reload:

nginx -t
systemctl reload nginx

Verify that HTTPS is working:

curl -vI https://shopify-events.yourdomain.com/health

You should see:

  • HTTP/2 200
  • a valid SSL certificate
  • a JSON response containing {"status":"healthy"}

Step 6: Shopify Webhook Configuration

Start the Application

cd /opt/reddit-capi
docker compose up -d

Watch the logs:

docker compose logs -f reddit-capi

You should see: Server running on port 3000

Configure the Webhook in Shopify

  1. Go to Shopify admin → Settings → Notifications
  2. Scroll down to "Webhooks"
  3. Click "Create webhook"

Order Creation Webhook:

  • Event: Order creation
  • Format: JSON
  • URL: https://shopify-events.yourdomain.com/webhooks/shopify
  • API version: latest (2025-01 or newer)

Click "Save webhook"

Copy the Webhook Signing Secret:

After you create the webhook, Shopify displays the signing secret. Copy it and add it to your .env file:

echo 'SHOPIFY_WEBHOOK_SECRET=your_actual_secret_here' >> /opt/reddit-capi/.env

Restart the container to pick up the new secret:

docker compose restart reddit-capi

Optional: Checkout Creation Webhook

If you want to track abandoned checkouts as "Lead" events in Reddit, repeat the process above for the "Checkout creation" event.


Step 7: Testing and Validation

Test HMAC Without a Valid Signature

curl https://shopify-events.yourdomain.com/webhooks/shopify

Expected response:

{"error":"Unauthorized - Missing HMAC"}

This confirms the security mechanism is working.

Send a Test Webhook from Shopify

  1. In the Shopify admin, go to the webhook you created
  2. Click "Send test notification"
  3. Watch the logs:
docker compose logs -f reddit-capi

You should see:

INFO: Webhook HMAC verified {"topic":"orders/create"}
INFO: Received Shopify webhook {"topic":"orders/create","shopDomain":"your-store.myshopify.com"}
INFO: Event saved to database {"event_id":"shopify_order_...","event_type":"Purchase"}
INFO: Sending event to Reddit CAPI {"tracking_type":"Purchase","conversion_id":"..."}
INFO: Reddit CAPI success {"status":200,"attempt":1}

Verify in Reddit Events Manager

  1. Go to Reddit Ads Manager → Events Manager
  2. Click your pixel
  3. Open the "Test Events" tab
  4. The test event should appear within a few minutes

Check the Database

docker compose exec postgres psql -U reddit_capi -d reddit_capi -c \
  "SELECT event_id, event_type, status, sent_at FROM events ORDER BY created_at DESC LIMIT 5;"

You should see status = 'sent' for the test event

Create a Real Test Order

Use Shopify's test order feature or place a real order in a development store. The webhook fires automatically, and the conversion should appear in Reddit within 15-30 minutes.


Troubleshooting

Problem: Events Not Showing Up in Reddit

Check the logs for Reddit API errors:

docker compose logs reddit-capi | grep -i "reddit api error"

Common errors:

  • "unexpected type number": value was sent as a decimal instead of an integer. Confirm you multiply by 100 and use Math.round()
  • "unknown field event_id": event_id was included in the Reddit payload. Remove event_id from redditEvent and use conversion_id instead
  • "missing required field": event_at, event_type, or user data is missing. Check that the transformer sets all required fields
  • 401 Unauthorized: invalid access token. Regenerate the token in Reddit Events Manager
  • 429 Too Many Requests: rate limiting. Wait for the Retry-After interval (handled automatically)

Problem: Shopify Webhooks Failing

Check the webhook status in Shopify:

In Shopify admin → Settings → Notifications → Webhooks, check whether the webhook shows errors.

Common causes:

  • HMAC verification failing: secret mismatch. Copy the correct secret from Shopify and update .env
  • Timeouts: the server did not respond within 5 seconds. Check that the container is running and healthy
  • SSL errors: invalid certificate. Verify with curl -vI https://your-domain.com/health

Problem: "stream is not readable"

This error occurs when both rawBodyCapture and express.json() try to read the request body.

Solution: remove the express.json() middleware from the webhook route. The rawBodyCapture function handles both raw-body capture and JSON parsing.

Wrong:

app.use('/webhooks/shopify', express.json());  // ❌ Don't use this
app.use('/webhooks/shopify', rawBodyCapture);

Right:

app.use('/webhooks/shopify', rawBodyCapture);  // ✅ Handles both

Problem: Duplicate Conversions

If you run both the Reddit Pixel and CAPI, make sure both send the same conversion_id.

Pixel implementation (on the Shopify thank-you page):

rdt('track', 'Purchase', {
  transactionId: '{{ order.id }}',  // Must match conversion_id in CAPI
  value: {{ order.total_price }},
  currency: '{{ order.currency }}'
});

Reddit automatically deduplicates events that share the same conversion_id.

Problem: Database Connection Failures

# Check if PostgreSQL is running
docker compose ps postgres

# Test connection
docker compose exec postgres pg_isready -U reddit_capi

# Check logs
docker compose logs postgres

Monitoring and Maintenance

Log Monitoring

Live logs:

docker compose logs -f reddit-capi

Search for errors:

docker compose logs reddit-capi | grep -i error

Filter by event type:

docker compose logs reddit-capi | grep "Purchase"

Database Queries

Recent events:

docker compose exec postgres psql -U reddit_capi -d reddit_capi -c \
  "SELECT event_id, event_type, status, created_at, sent_at
   FROM events
   ORDER BY created_at DESC
   LIMIT 20;"

Failed events:

docker compose exec postgres psql -U reddit_capi -d reddit_capi -c \
  "SELECT event_id, event_type, error_message, created_at
   FROM events
   WHERE status = 'failed'
   ORDER BY created_at DESC;"

Event statistics:

docker compose exec postgres psql -U reddit_capi -d reddit_capi -c \
  "SELECT event_type, status, COUNT(*) as count
   FROM events
   GROUP BY event_type, status
   ORDER BY event_type, status;"

Health Checks

Set up a cron job to monitor the health endpoint:

*/5 * * * * curl -sf https://shopify-events.yourdomain.com/health > /dev/null || echo "Reddit CAPI health check failed" | mail -s "Alert" your@email.com

SSL Certificate Renewal

Let's Encrypt certificates are valid for 90 days. Certbot should renew them automatically.

Test renewal:

certbot renew --dry-run

Check certificate expiry:

certbot certificates

Container Updates

Update after code changes:

docker compose down
docker compose up -d --build

Check container status:

docker compose ps

Restart without rebuilding:

docker compose restart reddit-capi

Advanced Configuration

Custom Event Types

Reddit supports additional standard events. You can extend the transformer to handle more Shopify webhooks:

ViewContent (product views):

// Add to shopifyTransformer.js
export function transformProductView(product) {
  return {
    event_at: new Date().toISOString(),
    event_type: {
      tracking_type: 'ViewContent',
    },
    user: { /* user data */ },
    event_metadata: {
      conversion_id: `product_${product.id}_${Date.now()}`,
      item_id: String(product.id),
      value: Math.round(parseFloat(product.price) * 100),
      currency: 'USD',
    },
  };
}

AddToCart:

case 'carts/update':
  eventData = transformCartUpdate(req.body);
  break;

Product-Level Tracking

Include product details in the event metadata:

event_metadata: {
  value: Math.round((parseFloat(order.total_price) || 0) * 100),
  conversion_id: String(order.id),
  currency: order.currency,

  // Product details
  products: order.line_items.map(item => ({
    id: String(item.product_id),
    name: item.title,
    quantity: item.quantity,
    price: Math.round(parseFloat(item.price) * 100),
  })),
},

Multi-Currency Support

The implementation already handles multiple currencies via Shopify's order.currency field. Just make sure amounts are always converted to minor units correctly:

  • USD, EUR, CAD: multiply by 100 (cents)
  • JPY, KRW: no multiplication (already in minor units)
  • BHD, KWD: multiply by 1000 (3 decimal places)

For currencies with different decimal counts, update the transformer:

function getMinorUnitMultiplier(currency) {
  const zeroDecimalCurrencies = ['JPY', 'KRW', 'CLP'];
  const threeDecimalCurrencies = ['BHD', 'KWD', 'OMR'];

  if (zeroDecimalCurrencies.includes(currency)) return 1;
  if (threeDecimalCurrencies.includes(currency)) return 1000;
  return 100;
}

// In transformer:
value: Math.round(parseFloat(order.total_price) * getMinorUnitMultiplier(order.currency))

Next Steps

This implementation gives you production-ready server-side conversion tracking for Reddit Ads. The architecture is built for reliability, with HMAC verification, deduplication, retry logic, and comprehensive logging.

You can extend it with:

  • Additional event types: product views, abandoned checkouts, searches
  • Enhanced user matching: include Reddit click IDs captured from URL parameters
  • Alerting: wire failed events into a monitoring tool
  • An analytics dashboard: visualize conversion data from PostgreSQL

The complete source code and Docker configuration shown here are based on a reference production deployment handling live Shopify orders.