Compare commits

10 Commits: df4c49c3ef ... d7c82a0da0

| SHA1 |
|---|
| d7c82a0da0 |
| 73079f743a |
| c46d53b261 |
| ca6e4cdf5d |
| c64fbbb072 |
| 5eddddc8ec |
| 8881b6933b |
| d7ca5e451d |
| fc7d07ae6b |
| 1f306fffd7 |
@@ -1 +0,0 @@
-python 3.12.3
AGENTS.txt (new file, 83 lines)
@@ -0,0 +1,83 @@
# AGENTS.txt - Development Preferences and Patterns

This file documents preferences and patterns for AI agents working on this project.

## CLI Command Alias Preferences

When implementing CLI commands, follow these patterns (see the Click sketch after this section):

### Command Aliases

- Prefer short aliases for common commands:
  - `list` → `ls` (Unix-style listing)
  - `add` → `a` (quick task creation)
  - `edit` → `e` (quick editing)
  - `complete` → `c`, `done` (task completion)
  - `delete` → `rm`, `del` (Unix-style removal)
  - `open` → `o` (quick opening)
  - `show` → `view`, `s` (viewing details)

### Option/Flag Aliases

- Use single-letter flags where possible:
  - `--due-date` → `-d`
  - `--project` → `-p`
  - `--priority` → `-pr`
  - `--tag` → `-t`
  - `--all` → `-a`
  - `--force` → `-f`
  - `--browser` → `-b`
  - `--content` → `-c`
  - `--limit` → `-l`

### Design Principles

- Follow Unix CLI conventions where applicable
- Provide both full and abbreviated forms for all commands
- Single-letter aliases for the most frequently used operations
- Intuitive mappings (e.g., `rm` for delete, `ls` for list)
- Consistent patterns across different modules
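
Click has no built-in alias mechanism, so these rules are typically implemented with a small `click.Group` subclass that resolves aliases before command lookup. The following is a minimal sketch of that pattern (illustrative only, not necessarily this project's exact implementation):

```python
import click

# Hypothetical alias table; extend per the preferences above.
ALIASES = {"ls": "list", "a": "add", "rm": "delete", "del": "delete"}

class AliasedGroup(click.Group):
    def get_command(self, ctx, cmd_name):
        # Resolve an alias to its canonical command name before lookup.
        return super().get_command(ctx, ALIASES.get(cmd_name, cmd_name))

@click.group(cls=AliasedGroup)
def cli():
    """Example task CLI."""

@cli.command("list")
@click.option("--project", "-p", help="Filter by project")
def list_cmd(project):
    click.echo(f"Listing tasks (project={project})")

if __name__ == "__main__":
    cli()  # `cli ls -p Work` now behaves like `cli list -p Work`
```

With this in place, the alias and the full command stay in sync automatically, since both dispatch to the same registered command.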

## Project Structure Patterns

### Services

- Place API clients in `src/services/<service_name>/`
- Include authentication in `auth.py`
- Main client logic in `client.py`
- Exports in `__init__.py`

### CLI Commands

- Group related commands in `src/cli/<service_name>.py`
- Register with main CLI in `src/cli/__init__.py`
- Use Click for command framework

### Utilities

- Shared utilities in `src/utils/`
- Service-specific utilities in `src/utils/<service_name>_utils.py`

### Token Storage

- Store auth tokens in `~/.local/share/gtd-terminal-tools/`
- Use project name consistently across services

## TickTick Integration Notes

### Authentication

- Uses OAuth2 flow with client credentials
- Tokens cached in `~/.local/share/gtd-terminal-tools/ticktick_tokens.json`
- Environment variables: `TICKTICK_CLIENT_ID`, `TICKTICK_CLIENT_SECRET`, `TICKTICK_REDIRECT_URI`

### Command Usage Examples

```bash
# List tasks
ticktick ls -p "Work"    # List by project
ticktick ls -d today     # List by due date
ticktick ls -pr high     # List by priority

# Task operations
ticktick a "Buy groceries" -d tomorrow -p "Personal"
ticktick e 123 --title "Updated task"
ticktick c 123           # Complete task
ticktick rm 123 -f       # Force delete task
ticktick o 123           # Open in browser/app
```

## Future Development

When adding new services or commands, follow these established patterns for consistency.

GODSPEED_SYNC.md (new file, 250 lines)
@@ -0,0 +1,250 @@
# Godspeed Sync

A two-way synchronization tool between the Godspeed task management API and local markdown files.

## Features

- **Bidirectional Sync**: Download tasks from Godspeed to markdown files, edit locally, and upload changes back
- **Directory Structure**: Creates a clean directory structure matching your Godspeed lists
- **ID Tracking**: Uses hidden HTML comments to track task IDs even when you rearrange tasks
- **Markdown Format**: Simple `- [ ] Task name <!-- id:abc123 -->` format for easy editing
- **Completion Status**: Supports incomplete `[ ]`, completed `[x]`, and cancelled `[-]` checkboxes
- **Notes Support**: Task notes are preserved and synced
- **CLI Interface**: Easy-to-use command line interface with shortcuts

## Installation

The Godspeed sync is part of the GTD Terminal Tools project. Make sure you have the required dependencies:

```bash
# Install dependencies
uv sync

# Or with pip
pip install requests click
```

## Configuration

### Option 1: Environment Variables

```bash
export GODSPEED_EMAIL="your@email.com"
export GODSPEED_PASSWORD="your-password"

# OR use an API token directly
export GODSPEED_TOKEN="your-api-token"

# Optional: Custom sync directory
export GODSPEED_SYNC_DIR="~/Documents/MyTasks"

# Optional: Disable SSL verification for corporate networks
export GODSPEED_DISABLE_SSL_VERIFY="true"
```

### Option 2: Interactive Setup

The tool will prompt for credentials if they are not provided via environment variables.

### Getting an API Token

You can get your API token from the Godspeed desktop app:

1. Open the Command Palette (Cmd/Ctrl + Shift + P)
2. Run "Copy API access token"
3. Use this token with the `GODSPEED_TOKEN` environment variable

## Usage

### Basic Commands

```bash
# Download all tasks from Godspeed to local markdown files
# (a short `gs` alias will be available once integrated into the main CLI)
python -m src.cli.godspeed download

# Upload local changes back to Godspeed
python -m src.cli.godspeed upload

# Bidirectional sync (download then upload)
python -m src.cli.godspeed sync

# Check sync status
python -m src.cli.godspeed status

# Open sync directory in file manager
python -m src.cli.godspeed open

# Test connection and SSL (helpful for corporate networks)
python -m src.cli.godspeed test-connection
```

### Workflow Example

1. **Initial sync**:

   ```bash
   python -m src.cli.godspeed download
   ```

2. **Edit tasks locally**:

   Open the generated markdown files in your favorite editor:

   ```
   ~/Documents/Godspeed/
   ├── Personal.md
   ├── Work_Projects.md
   └── Shopping.md
   ```

3. **Make changes**:

   ```markdown
   # Personal.md
   - [ ] Call dentist <!-- id:abc123 -->
   - [x] Buy groceries <!-- id:def456 -->
     Don't forget milk and eggs
   - [-] Old project <!-- id:ghi789 -->
   - [ ] New task I just added <!-- id:jkl012 -->

   # Work_Projects.md
   - [ ] Finish quarterly report <!-- id:xyz890 -->
     Due Friday
   - [-] Cancelled meeting <!-- id:uvw567 -->
   ```

4. **Sync changes back**:

   ```bash
   python -m src.cli.godspeed upload
   ```

## File Format

Each list becomes a markdown file with tasks in this format:

```markdown
- [ ] Incomplete task <!-- id:abc123 -->
- [x] Completed task <!-- id:def456 -->
- [X] Also completed (capital X works too) <!-- id:ghi789 -->
- [-] Cancelled/cleared task <!-- id:jkl012 -->
- [ ] Task with notes <!-- id:mno345 -->
  Notes go on the next line, indented
```

### Important Notes

- **Don't remove the `<!-- id:xxx -->` comments** - they're used to track tasks
- **Don't worry about the IDs** - they're auto-generated for new tasks
- **Checkbox format matters**:
  - Use `[ ]` for incomplete tasks
  - Use `[x]` or `[X]` for completed tasks
  - Use `[-]` for cancelled/cleared tasks
- **Completion status syncs both ways**:
  - Check/uncheck boxes in markdown → syncs to Godspeed
  - Mark complete/incomplete/cleared in Godspeed → syncs to markdown
- **Completed/cancelled tasks are hidden**: When downloading from Godspeed, only incomplete tasks appear in local files (keeps them clean)
- **Notes are optional** - indent them under the task line
- **File names** correspond to list names (special characters replaced with underscores)
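
To make the format concrete, here is a minimal parsing sketch for these task lines (an illustration of the format above, not the sync engine's actual parser):

```python
import re

TASK_RE = re.compile(r"^- \[( |x|X|-)\] (.*?)(?:\s*<!-- id:(\w+) -->)?\s*$")

def parse_task_line(line: str):
    """Return (status, title, task_id) or None if the line is not a task."""
    m = TASK_RE.match(line)
    if not m:
        return None
    box, title, task_id = m.groups()
    status = {"x": "complete", "X": "complete", "-": "cancelled"}.get(box, "incomplete")
    return status, title, task_id  # task_id is None for brand-new tasks

print(parse_task_line("- [x] Buy groceries <!-- id:def456 -->"))
# ('complete', 'Buy groceries', 'def456')
```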

## Directory Structure

By default, files are synced to:

- `~/Documents/Godspeed/` (if Documents folder exists)
- `~/.local/share/gtd-terminal-tools/godspeed/` (fallback)

Each Godspeed list becomes a `.md` file:

- "Personal" → `Personal.md`
- "Work Projects" → `Work_Projects.md`
- "Shopping List" → `Shopping_List.md`

## Sync Metadata

The tool stores sync metadata in `.godspeed_metadata.json`:

```json
{
  "task_mapping": {
    "local-id-1": "godspeed-task-id-1",
    "local-id-2": "godspeed-task-id-2"
  },
  "list_mapping": {
    "Personal": "godspeed-list-id-1",
    "Work Projects": "godspeed-list-id-2"
  },
  "last_sync": "2024-01-15T10:30:00"
}
```

## API Rate Limits

Godspeed has rate limits:

- **Listing**: 10 requests/minute, 200/hour
- **Creating/Updating**: 60 requests/minute, 1,000/hour

The sync tool respects these limits and handles errors gracefully.
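
One common way to respect such limits client-side is a sliding-window throttle. This sketch shows the idea (an assumption about the approach; the tool's internals are not shown here):

```python
import time

class RateLimiter:
    """Allow at most `max_calls` within a sliding `period` (seconds)."""

    def __init__(self, max_calls: int, period: float):
        self.max_calls, self.period = max_calls, period
        self.calls: list[float] = []

    def wait(self) -> None:
        now = time.monotonic()
        # Drop timestamps that have aged out of the window.
        self.calls = [t for t in self.calls if now - t < self.period]
        if len(self.calls) >= self.max_calls:
            time.sleep(self.period - (now - self.calls[0]))
        self.calls.append(time.monotonic())

list_limiter = RateLimiter(10, 60.0)   # 10 listing requests/minute
write_limiter = RateLimiter(60, 60.0)  # 60 create/update requests/minute
```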

## Troubleshooting

### SSL/Corporate Network Issues

If you're getting SSL certificate errors on a corporate network:

```bash
# Test the connection first
python -m src.cli.godspeed test-connection

# If SSL errors occur, bypass SSL verification
export GODSPEED_DISABLE_SSL_VERIFY=true
python -m src.cli.godspeed test-connection
```

### Authentication Issues

```bash
# Clear stored credentials
rm ~/.local/share/gtd-terminal-tools/godspeed_config.json

# Use token instead of password
export GODSPEED_TOKEN="your-token-here"
```

### Sync Issues

```bash
# Check current status
python -m src.cli.godspeed status

# Verify sync directory
ls ~/Documents/Godspeed/

# Check metadata
cat ~/.local/share/gtd-terminal-tools/godspeed/.godspeed_metadata.json
```

### Common Problems

1. **"List ID not found"**: New lists created locally will put tasks in your Inbox
2. **"Task not found"**: Tasks deleted in Godspeed won't sync back
3. **Duplicate tasks**: Don't manually copy task lines between files (IDs must be unique)

## Development

### Testing

Run the test suite:

```bash
python test_godspeed_sync.py
```

### File Structure

```
src/services/godspeed/
├── __init__.py   # Package init
├── client.py     # Godspeed API client
├── sync.py       # Sync engine
└── config.py     # Configuration management

src/cli/
└── godspeed.py   # CLI interface
```

## Contributing

This is part of the larger GTD Terminal Tools project. When contributing:

1. Follow the existing code style
2. Add tests for new functionality
3. Update this README for user-facing changes
4. Test with the mock data before real API calls

## License

Same as the parent GTD Terminal Tools project.

README.md (281 lines)
@@ -0,0 +1,281 @@
# luk

> Pronounced "look" - as in "look at your Outlook data locally"

A CLI tool for syncing Microsoft Outlook email, calendar, and tasks to local file-based formats like Maildir and vdir. Use your favorite terminal tools to manage your email and calendar.

## Features

- **Email Synchronization**: Sync emails with Microsoft Graph API to local Maildir format
- **Calendar Management**: Two-way calendar sync with vdir/ICS support
- **Task Integration**: Sync with Godspeed and TickTick task managers
- **TUI Dashboard**: Interactive terminal dashboard for monitoring sync progress
- **Daemon Mode**: Background daemon with proper Unix logging
- **Cross-Platform**: Works on macOS, Linux, and Windows

## Quick Start

### Prerequisites

- Python 3.12 or higher
- `uv` package manager (recommended)

### Installation

```bash
# Clone the repository
git clone https://github.com/timothybendt/luk.git
cd luk

# Run the installation script
./install.sh
```

### Manual Installation

```bash
# Create virtual environment
python3 -m venv .venv
source .venv/bin/activate

# Install dependencies
pip install -e .

# Setup configuration directories
mkdir -p ~/.config/luk
mkdir -p ~/.local/share/luk
```

## Configuration

Create a configuration file at `~/.config/luk/config.env`:

```bash
# Microsoft Graph settings
MICROSOFT_CLIENT_ID=your_client_id
MICROSOFT_TENANT_ID=your_tenant_id

# Email settings
MAILDIR_PATH=~/Mail
NOTES_DIR=~/Documents/Notes

# Godspeed settings
GODSPEED_EMAIL=your_email@example.com
GODSPEED_PASSWORD=your_password
GODSPEED_TOKEN=your_token
GODSPEED_SYNC_DIR=~/Documents/Godspeed

# TickTick settings
TICKTICK_CLIENT_ID=your_client_id
TICKTICK_CLIENT_SECRET=your_client_secret
```
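
The file uses plain `KEY=VALUE` lines, so it can be sourced by a shell or read directly. A minimal loader sketch (assumed behavior; luk's actual loader may differ):

```python
import os
from pathlib import Path

def load_config_env(path: Path = Path.home() / ".config/luk/config.env") -> None:
    """Export KEY=VALUE lines into the environment, skipping comments."""
    if not path.exists():
        return
    for line in path.read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        # Existing environment variables win over file values.
        os.environ.setdefault(key.strip(), value.strip())
```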

## Usage

### Basic Commands

```bash
# Show help
luk --help

# Run sync with default settings
luk sync run

# Run with TUI dashboard
luk sync run --dashboard

# Start daemon mode
luk sync run --daemon

# Stop daemon
luk sync stop

# Check daemon status
luk sync status
```

### Sync Options

```bash
# Dry run (no changes)
luk sync run --dry-run

# Specify organization
luk sync run --org mycompany

# Enable notifications
luk sync run --notify

# Download attachments
luk sync run --download-attachments

# Two-way calendar sync
luk sync run --two-way-calendar

# Custom calendar directory
luk sync run --vdir ~/Calendars
```

### Dashboard Mode

The TUI dashboard provides real-time monitoring of sync operations:

- **Status Display**: Current sync status and metrics
- **Progress Bars**: Visual progress for each sync component
- **Activity Log**: Scrollable log of all sync activities
- **Keyboard Shortcuts**:
  - `q`: Quit dashboard
  - `l`: Toggle log visibility
  - `r`: Refresh status

### Daemon Mode

Run luk as a background daemon with proper Unix logging:

```bash
# Start daemon
luk sync run --daemon

# Check status
luk sync status

# View logs
cat ~/.local/share/luk/luk.log

# Stop daemon
luk sync stop
```

Daemon logs are stored at `~/.local/share/luk/luk.log` with automatic rotation.
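
For reference, rotation like this is typically set up with Python's standard `RotatingFileHandler`; a sketch (assumed configuration, with hypothetical size and backup limits):

```python
import logging
from logging.handlers import RotatingFileHandler
from pathlib import Path

log_path = Path.home() / ".local/share/luk/luk.log"
log_path.parent.mkdir(parents=True, exist_ok=True)

# Hypothetical limits: rotate at ~1 MB, keep 3 old files.
handler = RotatingFileHandler(log_path, maxBytes=1_000_000, backupCount=3)
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s"))

logger = logging.getLogger("luk")
logger.addHandler(handler)
logger.setLevel(logging.INFO)
logger.info("daemon started")
```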

## Architecture

### Core Components

- **Sync Engine**: Handles email, calendar, and task synchronization
- **TUI Dashboard**: Interactive monitoring interface using Textual
- **Daemon Service**: Background service with logging and process management
- **Configuration**: Environment-based configuration system

### Directory Structure

```
src/
├── cli/                   # CLI commands and interfaces
│   ├── sync.py            # Main sync command
│   ├── sync_dashboard.py  # TUI dashboard
│   ├── sync_daemon.py     # Daemon service
│   └── ...
├── services/              # External service integrations
│   ├── microsoft_graph/   # Microsoft Graph API
│   ├── godspeed/          # Godspeed task manager
│   ├── ticktick/          # TickTick API
│   └── ...
└── utils/                 # Utility functions
```

## Development

### Setup Development Environment

```bash
# Clone repository
git clone https://github.com/timothybendt/luk.git
cd luk

# Install development dependencies
uv sync --dev

# Run tests
uv run pytest

# Run linting
uv run ruff check .
uv run ruff format .

# Type checking
uv run mypy src/
```

### Project Structure

- `pyproject.toml`: Project configuration and dependencies
- `src/cli/`: CLI commands and user interfaces
- `src/services/`: External service integrations
- `src/utils/`: Shared utilities and helpers
- `tests/`: Test suite

### Building for Distribution

```bash
# Build package
uv run build

# Check package
uv run twine check dist/*

# Upload to PyPI (for maintainers)
uv run twine upload dist/*
```

## Troubleshooting

### Common Issues

1. **Authentication Errors**: Ensure Microsoft Graph credentials are properly configured
2. **Permission Denied**: Check file permissions for Maildir and calendar directories
3. **Daemon Not Starting**: Verify log directory exists and is writable
4. **TUI Not Rendering**: Ensure terminal supports Textual requirements

### Debug Mode

Enable debug logging:

```bash
export LOG_LEVEL=DEBUG
luk sync run --dry-run
```

### Log Files

- **Daemon Logs**: `~/.local/share/luk/luk.log`
- **Sync State**: `~/.local/share/luk/sync_state.json`
- **Configuration**: `~/.config/luk/`

## Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests for new functionality
5. Run the test suite
6. Submit a pull request

### Code Style

This project uses:

- **Ruff** for linting and formatting
- **MyPy** for type checking
- **Black** for code formatting
- **Pre-commit** hooks for quality control

## License

MIT License - see LICENSE file for details.

## Support

- **Issues**: [GitHub Issues](https://github.com/timothybendt/luk/issues)
- **Documentation**: [GitHub Wiki](https://github.com/timothybendt/luk/wiki)
- **Discussions**: [GitHub Discussions](https://github.com/timothybendt/luk/discussions)

## Changelog

### v0.1.0

- Initial release
- Email synchronization with Microsoft Graph
- Calendar sync with vdir/ICS support
- Godspeed and TickTick integration
- TUI dashboard
- Daemon mode with logging
- Cross-platform support

TASK_SWEEPER.md (new file, 134 lines)
@@ -0,0 +1,134 @@
# Task Sweeper for Godspeed

A utility script to consolidate scattered incomplete tasks from markdown files into your Godspeed Inbox.

## Purpose

If you have notes scattered across directories (like `2024/`, `2025/`, project folders, etc.) with incomplete tasks in markdown format, this script will:

1. **Find all incomplete tasks** (`- [ ] Task name`) in markdown files (see the sketch after this list)
2. **Move them** to your Godspeed `Inbox.md` file
3. **Preserve completed/cancelled tasks** in their original locations
4. **Add source tracking** so you know where each task came from
5. **Clean up original files** by removing only the incomplete tasks
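
The core matching step is simple to sketch (illustrative logic only, not the script's exact implementation):

```python
import re
from pathlib import Path

INCOMPLETE_RE = re.compile(r"^- \[ \] .+", re.MULTILINE)

def find_incomplete_tasks(root: Path, exclude: Path):
    """Yield (relative source file, task line) for every incomplete task."""
    for md in root.rglob("*.md"):
        # Skip the Godspeed sync directory itself and hidden files.
        if exclude in md.parents or md.name.startswith("."):
            continue
        for match in INCOMPLETE_RE.finditer(md.read_text(encoding="utf-8")):
            yield md.relative_to(root), match.group(0)

for source, task in find_incomplete_tasks(Path.home() / "Documents/Notes",
                                          Path.home() / "Documents/Godspeed"):
    print(f"{task}  (from {source})")
```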

## Usage

```bash
# Dry run to see what would happen
python sweep_tasks.py ~/Documents/Notes ~/Documents/Godspeed --dry-run

# Actually perform the sweep
python sweep_tasks.py ~/Documents/Notes ~/Documents/Godspeed

# Sweep from current directory
python sweep_tasks.py . ./godspeed
```

## Example Workflow

**Before sweeping:**

```
~/Documents/Notes/
├── 2024/
│   ├── projects/website.md
│   │   ├── - [x] Create wireframes
│   │   ├── - [ ] Design mockups      ← Will be swept
│   │   └── - [ ] Get approval        ← Will be swept
│   └── notes/meeting.md
│       ├── - [ ] Update docs         ← Will be swept
│       └── - [x] Fix bug (completed)
├── 2025/
│   └── goals.md
│       └── - [ ] Launch feature      ← Will be swept
└── random-notes.md
    └── - [ ] Call dentist            ← Will be swept
```

**After sweeping:**

```
~/Documents/Godspeed/
└── Inbox.md                          ← All incomplete tasks here
    ├── - [ ] Design mockups
    │       From: 2024/projects/website.md
    ├── - [ ] Get approval
    │       From: 2024/projects/website.md
    ├── - [ ] Update docs
    │       From: 2024/notes/meeting.md
    ├── - [ ] Launch feature
    │       From: 2025/goals.md
    └── - [ ] Call dentist
            From: random-notes.md

~/Documents/Notes/
├── 2024/
│   ├── projects/website.md           ← Only completed tasks remain
│   │   └── - [x] Create wireframes
│   └── notes/meeting.md
│       └── - [x] Fix bug (completed)
├── 2025/
│   └── goals.md                      ← File cleaned/deleted if empty
└── random-notes.md                   ← File cleaned/deleted if empty
```

## Features

- **Safe Operation**: Always use `--dry-run` first to preview changes
- **Source Tracking**: Each swept task includes a note about its origin
- **Selective Processing**: Only moves incomplete tasks, preserves completed ones
- **Smart Cleanup**: Removes empty files or keeps non-task content
- **Godspeed Integration**: Creates properly formatted tasks with IDs for sync
- **Recursive Search**: Finds markdown files in all subdirectories
- **Exclusion Logic**: Skips the Godspeed directory itself and hidden files

## Integration with Godspeed Sync

After sweeping tasks:

1. **Review** the consolidated tasks in `Inbox.md`
2. **Upload to API**: Run `python -m src.cli godspeed upload`
3. **Organize in Godspeed**: Move tasks from Inbox to appropriate lists
4. **Sync back**: Run `python -m src.cli godspeed sync` to get the organized structure

## Safety Features

- **Dry run mode** shows exactly what will happen without making changes
- **Backup recommendation**: The script modifies files, so back up your notes first
- **Preserve content**: Non-task content (headings, notes, etc.) remains in original files
- **Completed task preservation**: `[x]` and `[-]` tasks stay where they are
- **Error handling**: Graceful handling of unreadable files or parsing errors

## Example Output

```
🧹 Sweeping incomplete tasks from: /Users/you/Documents/Notes
📥 Target Inbox: /Users/you/Documents/Godspeed/Inbox.md
🔍 Dry run: False
============================================================

📁 Found 8 markdown files to process

📄 Processing: 2024/projects/website.md
   🔄 Found 2 incomplete tasks:
      • Design mockups
      • Get client approval
   ✅ Keeping 1 completed/cleared tasks in place
   ✂️ Cleaned file (removed tasks): 2024/projects/website.md

📥 Writing 6 tasks to Inbox...
✅ Inbox updated: /Users/you/Documents/Godspeed/Inbox.md

============================================================
📊 SWEEP SUMMARY:
   • Files processed: 3
   • Tasks swept: 6
   • Target: /Users/you/Documents/Godspeed/Inbox.md

🎉 Successfully swept 6 tasks!
💡 Next steps:
   1. Review tasks in: /Users/you/Documents/Godspeed/Inbox.md
   2. Run 'godspeed upload' to sync to API
   3. Organize tasks into appropriate lists in Godspeed app
```

This tool is perfect for periodic "note cleanup" sessions where you consolidate scattered tasks into your main GTD system.

TICKTICK_SETUP.md (new file, 237 lines)
@@ -0,0 +1,237 @@
# TickTick CLI Integration Setup

This guide helps you set up the TickTick CLI integration for task management.

## Prerequisites

1. **TickTick Account**: You need a TickTick account
2. **TickTick Developer App**: Register an app at https://developer.ticktick.com/docs#/openapi

## Setup Steps

### 1. Register TickTick Developer App

1. Go to https://developer.ticktick.com/docs#/openapi
2. Click "Manage Apps" in the top right
3. Click "+App Name" to create a new app
4. Fill in the app name (required field only)
5. Note down your `Client ID` and `Client Secret`
6. Set the OAuth Redirect URL to: `http://localhost:8080`

### 2. Set Environment Variables

Add these to your shell profile (`.bashrc`, `.zshrc`, etc.):

```bash
# OAuth2 Credentials (Required)
export TICKTICK_CLIENT_ID="your_client_id_here"
export TICKTICK_CLIENT_SECRET="your_client_secret_here"
export TICKTICK_REDIRECT_URI="http://localhost:8080"

# TickTick Login Credentials (Optional - you'll be prompted if not set)
export TICKTICK_USERNAME="your_email@example.com"
export TICKTICK_PASSWORD="your_password"

# SSL Configuration (Optional - for corporate networks with MITM proxies)
# export TICKTICK_DISABLE_SSL_VERIFY="true"
```

**Important Note**: The TickTick library requires both OAuth2 credentials AND your regular TickTick login credentials. This is how the library is designed:

- **OAuth2**: Used for API authentication and authorization
- **Username/Password**: Required for initial session establishment

Your login credentials are only used for authentication and are not stored permanently.

### 3. Install Dependencies

```bash
uv sync
```

## Authentication

### Token Storage

OAuth tokens are automatically cached in:

```
~/.local/share/gtd-terminal-tools/ticktick_tokens.json
```

This file is created and managed automatically by the TickTick library. The tokens are used to avoid repeated OAuth flows and will be refreshed automatically when needed.

### Authentication Status

Check your authentication setup and token status:

```bash
ticktick auth-status
```

This command shows:

- OAuth credentials status (environment variables)
- Login credentials status
- Token cache status and expiration
- Token file location and last modified time

If you need to clear the token cache and re-authenticate:

```bash
ticktick clear-cache
```

## Usage

### Basic Commands

```bash
# List all tasks
ticktick list
ticktick ls                  # Short alias

# Filter by project
ticktick ls -p "Work"

# Filter by due date
ticktick ls -d today
ticktick ls -d tomorrow
ticktick ls -d "2024-01-15"

# Add a new task
ticktick add "Buy groceries"
ticktick a "Buy groceries" -d tomorrow -p "Personal"   # With options

# Edit a task
ticktick edit TASK_ID --title "New title"
ticktick e TASK_ID -d tomorrow -pr high

# Complete a task
ticktick complete TASK_ID
ticktick done TASK_ID
ticktick c TASK_ID           # Short alias

# Delete a task
ticktick delete TASK_ID
ticktick rm TASK_ID -f       # Force delete without confirmation

# Open task in browser/app
ticktick open TASK_ID
ticktick o TASK_ID           # Short alias

# Show detailed task info
ticktick show TASK_ID
ticktick s TASK_ID           # Short alias

# List projects and tags
ticktick projects
ticktick tags

# Sync with TickTick servers
ticktick sync
```

### Command Aliases Reference

| Full Command | Short Alias | Description |
|--------------|-------------|-------------|
| `list` | `ls` | List tasks |
| `add` | `a` | Add new task |
| `edit` | `e` | Edit existing task |
| `complete` | `c`, `done` | Mark task complete |
| `delete` | `rm`, `del` | Delete task |
| `open` | `o` | Open in browser/app |
| `show` | `s`, `view` | Show task details |
| `projects` | `proj` | List projects |

### Option Aliases

| Full Option | Short | Description |
|-------------|-------|-------------|
| `--project` | `-p` | Filter/set project |
| `--due-date` | `-d` | Filter/set due date |
| `--priority` | `-pr` | Filter/set priority |
| `--tag` | `-t` | Filter by tag |
| `--all` | `-a` | Show all tasks |
| `--force` | `-f` | Skip confirmations |
| `--browser` | `-b` | Force browser opening |
| `--content` | `-c` | Task description |
| `--limit` | `-l` | Limit results |

### Priority Levels

You can set priorities using numbers (0-5) or names:

- `0` or `none`: No priority
- `1` or `low`: Low priority
- `2` or `medium`: Medium priority
- `3` or `high`: High priority
- `4` or `urgent`: Very high priority
- `5` or `critical`: Critical priority
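
A plausible mapping for this scheme (illustrative; the CLI's actual parsing may differ):

```python
PRIORITY_ALIASES = {
    "none": 0, "low": 1, "medium": 2, "high": 3, "urgent": 4, "critical": 5,
}

def parse_priority(value: str) -> int:
    """Accept either a digit 0-5 or a named level."""
    value = value.strip().lower()
    if value.isdigit() and 0 <= int(value) <= 5:
        return int(value)
    try:
        return PRIORITY_ALIASES[value]
    except KeyError:
        raise ValueError(f"Unknown priority: {value!r}")
```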

### Date Formats

Supported date formats:

- `today`, `tomorrow`, `yesterday`
- `YYYY-MM-DD` (e.g., `2024-01-15`)
- Most common date formats via dateutil parsing
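
Putting those together, date parsing likely looks something like this sketch (assumes the `python-dateutil` package, which the last bullet implies):

```python
from datetime import date, timedelta
from dateutil import parser as dateparser

def parse_due(value: str) -> date:
    """Resolve relative keywords first, then fall back to dateutil."""
    value = value.strip().lower()
    today = date.today()
    if value == "today":
        return today
    if value == "tomorrow":
        return today + timedelta(days=1)
    if value == "yesterday":
        return today - timedelta(days=1)
    return dateparser.parse(value).date()
```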

## Authentication Flow

The TickTick integration uses a **dual authentication approach**:

1. **OAuth2 Setup**: On first use, the CLI will:
   - Open a web browser for OAuth authorization
   - Prompt you to copy the redirect URL
   - Cache the OAuth token in `~/.local/share/gtd-terminal-tools/ticktick_tokens.json`

2. **Login Credentials**: The library also requires your TickTick username/password for session establishment. You can either:
   - Set `TICKTICK_USERNAME` and `TICKTICK_PASSWORD` environment variables
   - Enter them when prompted (they won't be stored)

The OAuth token cache lasts about 6 months, after which you'll need to re-authenticate.

**Why Both?**: The `ticktick-py` library uses OAuth2 for API calls but requires login credentials for initial session setup. This is the library's design, not a limitation of our CLI.

## macOS App Integration

On macOS, the `ticktick open` command will try to open tasks in the TickTick desktop app first, falling back to the browser if the app isn't available.

## Troubleshooting

### "Please set TICKTICK_CLIENT_ID" Error

Make sure you've set the environment variables and restarted your terminal.

### Authentication Issues

Try clearing the token cache:

```bash
rm ~/.local/share/gtd-terminal-tools/ticktick_tokens.json
```

### SSL Certificate Errors

If you get SSL certificate verification errors (common on corporate networks with MITM proxies):

```bash
export TICKTICK_DISABLE_SSL_VERIFY="true"
```

**Warning**: This disables SSL verification. Only use this on trusted corporate networks.

### Network/API Errors

Check your internet connection and verify your TickTick credentials.

## Example Workflow

```bash
# Morning routine: check today's tasks
ticktick ls -d today

# Add a quick task
ticktick a "Review reports" -p "Work" -d today -pr high

# Complete a task when done
ticktick c TASK_ID

# Check what's due tomorrow
ticktick ls -d tomorrow

# Open an important task for details
ticktick o TASK_ID
```

@@ -1,315 +0,0 @@

```python
#!/usr/bin/env python3
"""
Benchmark script to compare two approaches for updating envelopes list in maildir_gtd.
This script compares:
1. Using .pop() to remove items from ListView
2. Using refresh_list_view() to rebuild the entire ListView

It tests with different numbers of envelopes (100, 1000, 2000) and measures:
- Time to remove a single item
- Time to remove multiple items in sequence
- Memory usage
"""

import sys
import os
import time
import random
import gc
import tracemalloc
from datetime import datetime, timedelta, UTC
from typing import List, Dict, Any, Callable, Tuple
import json

# Add parent directory to path so we can import modules correctly
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

# Import required classes and functions
from textual.widgets import ListView, ListItem, Label
from textual.app import App, ComposeResult
from textual.containers import Vertical

# Import our application's modules
from maildir_gtd.app import MessageStore
from maildir_gtd.utils import group_envelopes_by_date


# Mock class to simulate the ListView behavior
class MockListView:
    def __init__(self):
        self.items = []
        self.index = 0

    def append(self, item):
        self.items.append(item)

    def pop(self, idx=None):
        if idx is None:
            return self.items.pop()
        return self.items.pop(idx)

    def clear(self):
        self.items = []

    def __len__(self):
        return len(self.items)


# Helper functions to generate test data
def generate_envelope(idx: int) -> Dict[str, Any]:
    """Generate a synthetic envelope with predictable data."""
    now = datetime.now(UTC)
    # Distribute dates over the last 60 days to create realistic grouping
    date = now - timedelta(days=random.randint(0, 60),
                           hours=random.randint(0, 23),
                           minutes=random.randint(0, 59))

    return {
        "id": str(idx),
        "subject": f"Test Subject {idx}",
        "from": {"addr": f"sender{idx}@example.com"},
        "to": {"addr": f"recipient{idx}@example.com"},
        "date": date.strftime("%Y-%m-%d %H:%M"),
        "cc": {},
        "type": "message"
    }


def generate_test_envelopes(count: int) -> List[Dict[str, Any]]:
    """Generate a specified number of test envelopes."""
    return [generate_envelope(i) for i in range(1, count + 1)]


# Benchmark functions
def benchmark_pop_approach(store: MessageStore, list_view: MockListView, indices_to_remove: List[int]) -> float:
    """Benchmark the .pop() approach."""
    start_time = time.time()

    for idx in sorted(indices_to_remove, reverse=True):  # Remove from highest to lowest to avoid index shifting issues
        msg_id = int(store.envelopes[idx]["id"])
        store.remove(msg_id)
        list_view.pop(idx)

    end_time = time.time()
    return end_time - start_time


def benchmark_refresh_approach(store: MessageStore, list_view: MockListView, indices_to_remove: List[int]) -> float:
    """Benchmark the refresh_list_view approach."""
    start_time = time.time()

    for idx in indices_to_remove:
        msg_id = int(store.envelopes[idx]["id"])
        store.remove(msg_id)

        # Simulate refresh_list_view by clearing and rebuilding the list
        list_view.clear()
        for item in store.envelopes:
            if item and item.get("type") == "header":
                list_view.append(f"Header: {item['label']}")
            elif item:  # Check if not None
                list_view.append(f"Email: {item.get('subject', '')}")

    end_time = time.time()
    return end_time - start_time


def run_memory_benchmark(func, *args):
    """Run a function with memory tracking."""
    tracemalloc.start()
    result = func(*args)
    current, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return result, current, peak


def run_benchmark(envelope_count: int, num_operations: int = 10):
    """Run benchmarks for a specific number of envelopes."""
    print(f"\n{'=' * 50}")
    print(f"Running benchmark with {envelope_count} envelopes")
    print(f"{'=' * 50}")

    # Generate test data
    envelopes = generate_test_envelopes(envelope_count)

    # Set up for pop approach
    pop_store = MessageStore()
    pop_store.load(envelopes.copy())
    pop_list_view = MockListView()

    # Build initial list view
    for item in pop_store.envelopes:
        if item and item.get("type") == "header":
            pop_list_view.append(f"Header: {item['label']}")
        elif item:
            pop_list_view.append(f"Email: {item.get('subject', '')}")

    # Set up for refresh approach
    refresh_store = MessageStore()
    refresh_store.load(envelopes.copy())
    refresh_list_view = MockListView()

    # Build initial list view
    for item in refresh_store.envelopes:
        if item and item.get("type") == "header":
            refresh_list_view.append(f"Header: {item['label']}")
        elif item:
            refresh_list_view.append(f"Email: {item.get('subject', '')}")

    # Generate random indices to remove (ensure they're valid message indices, not headers)
    valid_indices = []
    for idx, item in enumerate(pop_store.envelopes):
        if item and item.get("type") != "header" and item is not None:
            valid_indices.append(idx)

    if len(valid_indices) < num_operations:
        num_operations = len(valid_indices)
        print(f"Warning: Only {num_operations} valid messages available for removal")

    indices_to_remove = random.sample(valid_indices, num_operations)

    # Single operation benchmark
    print("\n🔹 Single operation benchmark (removing 1 item):")

    # Pop approach - single operation
    gc.collect()  # Ensure clean state
    single_pop_time, pop_current, pop_peak = run_memory_benchmark(
        benchmark_pop_approach, pop_store, pop_list_view, [indices_to_remove[0]]
    )
    print(f" Pop approach: {single_pop_time*1000:.2f} ms (Memory - Current: {pop_current/1024:.1f} KB, Peak: {pop_peak/1024:.1f} KB)")

    # Refresh approach - single operation
    gc.collect()  # Ensure clean state
    single_refresh_time, refresh_current, refresh_peak = run_memory_benchmark(
        benchmark_refresh_approach, refresh_store, refresh_list_view, [indices_to_remove[0]]
    )
    print(f" Refresh approach: {single_refresh_time*1000:.2f} ms (Memory - Current: {refresh_current/1024:.1f} KB, Peak: {refresh_peak/1024:.1f} KB)")

    # Determine which is better for single operation
    if single_pop_time < single_refresh_time:
        print(f" 🥇 Pop is {single_refresh_time/single_pop_time:.1f}x faster for single operation")
    else:
        print(f" 🥇 Refresh is {single_pop_time/single_refresh_time:.1f}x faster for single operation")

    # Reset for multi-operation benchmark
    gc.collect()
    pop_store = MessageStore()
    pop_store.load(envelopes.copy())
    pop_list_view = MockListView()
    for item in pop_store.envelopes:
        if item and item.get("type") == "header":
            pop_list_view.append(f"Header: {item['label']}")
        elif item:
            pop_list_view.append(f"Email: {item.get('subject', '')}")

    refresh_store = MessageStore()
    refresh_store.load(envelopes.copy())
    refresh_list_view = MockListView()
    for item in refresh_store.envelopes:
        if item and item.get("type") == "header":
            refresh_list_view.append(f"Header: {item['label']}")
        elif item:
            refresh_list_view.append(f"Email: {item.get('subject', '')}")

    # Multiple operations benchmark
    print(f"\n🔹 Multiple operations benchmark (removing {num_operations} items):")

    # Pop approach - multiple operations
    gc.collect()
    multi_pop_time, pop_current, pop_peak = run_memory_benchmark(
        benchmark_pop_approach, pop_store, pop_list_view, indices_to_remove
    )
    print(f" Pop approach: {multi_pop_time*1000:.2f} ms (Memory - Current: {pop_current/1024:.1f} KB, Peak: {pop_peak/1024:.1f} KB)")

    # Refresh approach - multiple operations
    gc.collect()
    multi_refresh_time, refresh_current, refresh_peak = run_memory_benchmark(
        benchmark_refresh_approach, refresh_store, refresh_list_view, indices_to_remove
    )
    print(f" Refresh approach: {multi_refresh_time*1000:.2f} ms (Memory - Current: {refresh_current/1024:.1f} KB, Peak: {refresh_peak/1024:.1f} KB)")

    # Determine which is better for multiple operations
    if multi_pop_time < multi_refresh_time:
        print(f" 🥇 Pop is {multi_refresh_time/multi_pop_time:.1f}x faster for multiple operations")
    else:
        print(f" 🥇 Refresh is {multi_pop_time/multi_refresh_time:.1f}x faster for multiple operations")

    return {
        "envelope_count": envelope_count,
        "num_operations": num_operations,
        "single_operation": {
            "pop_time_ms": single_pop_time * 1000,
            "refresh_time_ms": single_refresh_time * 1000,
            "pop_memory_kb": pop_peak / 1024,
            "refresh_memory_kb": refresh_peak / 1024
        },
        "multiple_operations": {
            "pop_time_ms": multi_pop_time * 1000,
            "refresh_time_ms": multi_refresh_time * 1000,
            "pop_memory_kb": pop_peak / 1024,
            "refresh_memory_kb": refresh_peak / 1024
        }
    }


def main():
    print("\n📊 MAILDIR GTD LIST UPDATE BENCHMARK 📊")
    print("Comparing .pop() vs refresh_list_view() approaches")
    print("=" * 60)

    # Define test cases
    envelope_counts = [100, 1000, 2000]
    results = []

    for count in envelope_counts:
        result = run_benchmark(count)
        results.append(result)

    # Print summary
    print("\n" + "=" * 60)
    print("📊 BENCHMARK SUMMARY")
    print("=" * 60)

    # Console table formatting
    print(f"{'Size':<10} | {'Single Op (pop)':<15} | {'Single Op (refresh)':<20} | {'Multi Op (pop)':<15} | {'Multi Op (refresh)':<20}")
    print("-" * 90)

    for result in results:
        count = result["envelope_count"]
        single_pop = f"{result['single_operation']['pop_time_ms']:.2f} ms"
        single_refresh = f"{result['single_operation']['refresh_time_ms']:.2f} ms"
        multi_pop = f"{result['multiple_operations']['pop_time_ms']:.2f} ms"
        multi_refresh = f"{result['multiple_operations']['refresh_time_ms']:.2f} ms"

        print(f"{count:<10} | {single_pop:<15} | {single_refresh:<20} | {multi_pop:<15} | {multi_refresh:<20}")

    # Display conclusions
    print("\n🔍 CONCLUSIONS:")
    for result in results:
        count = result["envelope_count"]
        single_ratio = result['single_operation']['refresh_time_ms'] / result['single_operation']['pop_time_ms']
        multi_ratio = result['multiple_operations']['refresh_time_ms'] / result['multiple_operations']['pop_time_ms']

        print(f"\nFor {count} envelopes:")

        if single_ratio > 1:
            print(f"- Single operation: .pop() is {single_ratio:.1f}x faster")
        else:
            print(f"- Single operation: refresh_list_view() is {1/single_ratio:.1f}x faster")

        if multi_ratio > 1:
            print(f"- Multiple operations: .pop() is {multi_ratio:.1f}x faster")
        else:
            print(f"- Multiple operations: refresh_list_view() is {1/multi_ratio:.1f}x faster")

    print("\n🔑 RECOMMENDATION:")
    # Calculate average performance difference across all tests
    avg_single_ratio = sum(r['single_operation']['refresh_time_ms'] / r['single_operation']['pop_time_ms'] for r in results) / len(results)
    avg_multi_ratio = sum(r['multiple_operations']['refresh_time_ms'] / r['multiple_operations']['pop_time_ms'] for r in results) / len(results)

    if avg_single_ratio > 1 and avg_multi_ratio > 1:
        print("The .pop() approach is generally faster, but consider the following:")
        print("- .pop() risks index misalignment issues with the message_store")
        print("- refresh_list_view() ensures UI and data structure stay synchronized")
        print("- The performance difference may not be noticeable to users")
        print("👉 Recommendation: Use refresh_list_view() for reliability unless performance becomes a real issue")
    else:
        print("The refresh_list_view() approach is not only safer but also performs competitively:")
        print("- It ensures perfect synchronization between UI and data model")
        print("- It eliminates the risk of index misalignment")
        print("👉 Recommendation: Use refresh_list_view() approach as it's more reliable and performs well")


if __name__ == "__main__":
    main()
```

check_env.py (new executable file, 88 lines)
@@ -0,0 +1,88 @@
```python
#!/usr/bin/env python3
"""Environment validation script for GTD Terminal Tools."""

import sys
from pathlib import Path

# Add src to path for imports
sys.path.insert(0, str(Path(__file__).parent))

from src.utils.platform import validate_environment


def main():
    """Run environment validation and exit with appropriate code."""
    env_info = validate_environment()

    print("GTD Terminal Tools - Environment Validation")
    print("=" * 50)

    # Platform info
    platform_info = env_info["platform_info"]
    print(f"Platform: {platform_info['system']} {platform_info['release']}")
    print(
        f"Python: {platform_info['python_version']} ({platform_info['python_implementation']})"
    )
    print(f"Supported: {'✓' if env_info['platform_supported'] else '✗'}")
    print()

    # Dependencies
    print("Dependencies:")
    all_deps_available = True
    for dep, available in env_info["dependencies"].items():
        status = "✓" if available else "✗"
        print(f" {dep}: {status}")
        if not available:
            all_deps_available = False
    print()

    # Terminal compatibility
    print("Terminal Compatibility:")
    terminal_ok = True
    for feature, supported in env_info["terminal_compatibility"].items():
        status = "✓" if supported else "✗"
        print(f" {feature}: {status}")
        if not supported and feature in ["color_support", "textual_support"]:
            terminal_ok = False
    print()

    # Directories
    print("Directories:")
    for dir_type, dir_path in [
        ("config", "config_dir"),
        ("data", "data_dir"),
        ("logs", "log_dir"),
    ]:
        path = Path(env_info[dir_path])
        exists = path.exists()
        status = "✓" if exists else "✗"
        print(f" {dir_type.capitalize()}: {env_info[dir_path]} {status}")
    print()

    # Recommendations
    if env_info["recommendations"]:
        print("Recommendations:")
        for rec in env_info["recommendations"]:
            print(f" • {rec}")
        print()

    # Overall status
    platform_ok = env_info["platform_supported"]
    overall_ok = platform_ok and all_deps_available and terminal_ok

    if overall_ok:
        print("✓ Environment is ready for GTD Terminal Tools")
        sys.exit(0)
    else:
        print("✗ Environment has issues that need to be addressed")
        if not platform_ok:
            print(" - Unsupported platform or Python version")
        if not all_deps_available:
            print(" - Missing dependencies")
        if not terminal_ok:
            print(" - Terminal compatibility issues")
        sys.exit(1)


if __name__ == "__main__":
    main()
```

debug_ticktick.py (new executable file, 129 lines)
@@ -0,0 +1,129 @@
```python
#!/usr/bin/env python3
"""
Debug script to test TickTick authentication in isolation
"""

import os
import sys
import logging
from pathlib import Path

# Add src to path
sys.path.insert(0, str(Path(__file__).parent / "src"))

# Enable debug logging
logging.basicConfig(level=logging.DEBUG, format="%(levelname)s: %(message)s")

# Set SSL bypass for corporate networks
os.environ["TICKTICK_DISABLE_SSL_VERIFY"] = "true"

# Set your credentials here for testing
TEST_CLIENT_ID = input("Enter your TICKTICK_CLIENT_ID: ").strip()
TEST_CLIENT_SECRET = input("Enter your TICKTICK_CLIENT_SECRET: ").strip()
TEST_USERNAME = input("Enter your TickTick username/email: ").strip()

import getpass

TEST_PASSWORD = getpass.getpass("Enter your TickTick password: ")

if not all([TEST_CLIENT_ID, TEST_CLIENT_SECRET, TEST_USERNAME, TEST_PASSWORD]):
    print("All credentials are required")
    sys.exit(1)

os.environ["TICKTICK_CLIENT_ID"] = TEST_CLIENT_ID
os.environ["TICKTICK_CLIENT_SECRET"] = TEST_CLIENT_SECRET

print("\n" + "=" * 60)
print("TICKTICK DEBUG TEST")
print("=" * 60)

try:
    print("1. Testing OAuth client creation...")
    from services.ticktick.auth import create_oauth_client, get_token_file_path

    oauth_client = create_oauth_client()
    print("✓ OAuth client created")
    print(f"✓ Expected cache path: {get_token_file_path()}")

    # Check if we have a cached token
    token_file = get_token_file_path()
    print(f"✓ Token file exists: {token_file.exists()}")
    if token_file.exists():
        from services.ticktick.auth import load_stored_tokens

        tokens = load_stored_tokens()
        if tokens:
            print(
                f"✓ Token loaded, expires: {tokens.get('readable_expire_time', 'Unknown')}"
            )
        else:
            print("⚠ Token file exists but couldn't load")

    print("\n2. Testing OAuth token retrieval...")
    access_token = oauth_client.get_access_token()
    print(f"✓ Access token retrieved: {access_token[:10]}...{access_token[-10:]}")

    print("\n3. Testing TickTick client creation...")
    from ticktick.api import TickTickClient

    # Enable more verbose logging to see HTTP requests
    import urllib3

    urllib3.disable_warnings()

    # Monkey patch to get more details about the HTTP response
    original_check_status = TickTickClient.check_status_code

    def debug_check_status(self, response, error_message):
        print(f"HTTP Response Status: {response.status_code}")
        print(f"HTTP Response Headers: {dict(response.headers)}")
        print(f"HTTP Response Text (first 200 chars): {response.text[:200]}")
        return original_check_status(self, response, error_message)

    TickTickClient.check_status_code = debug_check_status

    # This is where the error likely occurs
    print(f"Creating client with username: {TEST_USERNAME}")
    client = TickTickClient(TEST_USERNAME, TEST_PASSWORD, oauth_client)
    print("✓ TickTickClient created successfully!")

    print("\n4. Testing API call...")
    try:
        projects = client.get_by_fields(search="projects")
        print(f"✓ API call successful - found {len(projects)} projects")
    except Exception as api_e:
        print(f"⚠ API call failed: {api_e}")

    print("\n🎉 ALL TESTS PASSED!")

except Exception as e:
    print(f"\n❌ ERROR: {e}")
    print(f"Error type: {type(e).__name__}")

    import traceback

    print("\nFull traceback:")
    traceback.print_exc()

    # Additional debugging
    print("\nDebugging information:")
    print(f"- Python version: {sys.version}")
    print(f"- Working directory: {os.getcwd()}")
    print(f"- Token file path: {get_token_file_path()}")

    # Check if this is the specific "Could Not Complete Request" error
    if "Could Not Complete Request" in str(e):
        print("""
This error typically indicates one of:
1. Incorrect TickTick username/password
2. Account locked or requires 2FA
3. Network/SSL issues (even with SSL disabled)
4. TickTick API changes or service issues

Suggestions:
- Double-check your TickTick login at https://ticktick.com
- Try a different password (maybe you have special characters?)
- Check if your account has 2FA enabled
- Try again later (might be temporary API issue)
""")

print("\n" + "=" * 60)
```

demo_cancelled_workflow.py (new file, 110 lines)
@@ -0,0 +1,110 @@
```python
#!/usr/bin/env python3
"""
Demo showing cancelled task workflow with Godspeed sync.
"""

import tempfile
from pathlib import Path


def demo_cancelled_workflow():
    print("=== Godspeed Cancelled Task Workflow Demo ===\n")

    from src.services.godspeed.sync import GodspeedSync

    with tempfile.TemporaryDirectory() as temp_dir:
        sync_dir = Path(temp_dir)
        sync_engine = GodspeedSync(None, sync_dir)

        print("📝 Scenario: Managing a project with tasks that get cancelled")
        print("=" * 65)

        # Initial tasks
        print("\n1. Initial project tasks in markdown:")
        initial_tasks = [
            ("task1", "incomplete", "Design new feature", ""),
            ("task2", "incomplete", "Get approval from stakeholders", ""),
            ("task3", "incomplete", "Implement feature", ""),
            ("task4", "incomplete", "Write documentation", ""),
            ("task5", "incomplete", "Deploy to production", ""),
        ]

        project_file = sync_dir / "New_Feature_Project.md"
        sync_engine._write_list_file(project_file, initial_tasks)

        with open(project_file, "r") as f:
            print(f.read())

        print("2. Project update - some tasks completed, one cancelled:")
        print("-" * 58)

        # Simulate project evolution
        updated_content = """- [x] Design new feature <!-- id:task1 -->
- [-] Get approval from stakeholders <!-- id:task2 -->
  Stakeholders decided to cancel this feature
- [-] Implement feature <!-- id:task3 -->
  No longer needed since feature was cancelled
- [-] Write documentation <!-- id:task4 -->
  Documentation not needed for cancelled feature
- [-] Deploy to production <!-- id:task5 -->
  Cannot deploy cancelled feature
- [ ] Archive project files <!-- id:task6 -->
  New cleanup task
"""

        with open(project_file, "w") as f:
            f.write(updated_content)

        print(updated_content)

        # Parse the changes
        updated_tasks = sync_engine._read_list_file(project_file)

        print("3. What would sync to Godspeed API:")
        print("-" * 36)

        api_calls = []
        for local_id, status, title, notes in updated_tasks:
            if status == "complete":
                api_calls.append(
                    f"PATCH /tasks/{local_id} {{'is_complete': True, 'is_cleared': False}}"
                )
                print(f" ✅ COMPLETE: {title}")
            elif status == "cleared":
                api_calls.append(
                    f"PATCH /tasks/{local_id} {{'is_complete': True, 'is_cleared': True}}"
                )
                print(f" ❌ CANCEL: {title}")
                if notes:
                    print(f" Reason: {notes}")
            elif local_id == "task6":  # New task
                api_calls.append(
                    f"POST /tasks {{'title': '{title}', 'list_id': 'project-list'}}"
                )
                print(f" ➕ NEW: {title}")
            else:
                print(f" ⏳ INCOMPLETE: {title}")

        print(f"\n4. API calls that would be made ({len(api_calls)} total):")
        print("-" * 49)
        for call in api_calls:
            print(f" {call}")

        print("\n5. Next sync download behavior:")
        print("-" * 32)
        print(" When downloading from Godspeed API:")
        print(" • Only incomplete tasks appear in local files")
        print(" • Completed and cancelled tasks are hidden")
        print(" • This keeps your local markdown files clean")
        print(" • Current file would only show: 'Archive project files'")

        print("\n✨ Benefits of this workflow:")
        print(" • Clear visual distinction: [-] for cancelled vs [x] for completed")
        print(" • Cancelled tasks sync to Godspeed's 'cleared' status")
        print(" • Completed/cancelled tasks auto-hide on next download")
        print(" • Notes explain why tasks were cancelled")
        print(" • Clean local files focused on active work")


if __name__ == "__main__":
    demo_cancelled_workflow()
```
103
demo_completion_sync.py
Normal file
@@ -0,0 +1,103 @@
#!/usr/bin/env python3
"""
Demo script showing how Godspeed completion status sync works.
This creates sample markdown files and shows the sync behavior.
"""

import tempfile
from pathlib import Path


def demo_completion_sync():
    print("=== Godspeed Completion Status Sync Demo ===\n")

    from src.services.godspeed.sync import GodspeedSync

    with tempfile.TemporaryDirectory() as temp_dir:
        sync_dir = Path(temp_dir)
        sync_engine = GodspeedSync(None, sync_dir)

        print("1. Creating sample markdown file with mixed completion states:")
        print("-" * 60)

        # Create sample tasks
        sample_tasks = [
            ("task001", False, "Buy groceries", "Don't forget milk"),
            ("task002", True, "Call dentist", ""),
            ("task003", False, "Finish project", "Due next Friday"),
            ("task004", True, "Exercise today", "Went for a 30min run"),
        ]

        # Write to markdown file
        demo_file = sync_dir / "Personal.md"
        sync_engine._write_list_file(demo_file, sample_tasks)

        # Show the generated markdown
        with open(demo_file, "r") as f:
            content = f.read()

        print(content)
        print("-" * 60)

        print("\n2. What this represents in Godspeed:")
        for task_id, is_complete, title, notes in sample_tasks:
            status = "✅ COMPLETED" if is_complete else "⏳ INCOMPLETE"
            print(f" {status}: {title}")
            if notes:
                print(f" Notes: {notes}")

        print("\n3. Now let's modify the markdown file (simulate user editing):")
        print("-" * 60)

        # Simulate user changes - flip some completion states
        modified_content = content.replace(
            "- [ ] Buy groceries",
            "- [x] Buy groceries",  # Mark as complete
        ).replace(
            "- [x] Call dentist",
            "- [ ] Call dentist",  # Mark as incomplete
        )

        # Add a new task
        modified_content += "- [ ] New task from markdown <!-- id:task005 -->\n"

        print(modified_content)
        print("-" * 60)

        # Write the modified content
        with open(demo_file, "w") as f:
            f.write(modified_content)

        # Parse the changes
        updated_tasks = sync_engine._read_list_file(demo_file)

        print("\n4. Changes that would sync to Godspeed:")
        print("-" * 40)

        for i, (task_id, is_complete, title, notes) in enumerate(updated_tasks):
            if i < len(sample_tasks):
                old_complete = sample_tasks[i][1]
                if old_complete != is_complete:
                    action = "MARK COMPLETE" if is_complete else "MARK INCOMPLETE"
                    print(f" 🔄 {action}: {title}")
                else:
                    status = "✅" if is_complete else "⏳"
                    print(f" {status} No change: {title}")
            else:
                print(f" ➕ CREATE NEW: {title}")

        print("\n5. API calls that would be made:")
        print("-" * 35)
        print(" PATCH /tasks/task001 {'is_complete': True}")
        print(" PATCH /tasks/task002 {'is_complete': False}")
        print(" POST /tasks {'title': 'New task from markdown'}")

        print("\n✨ Summary:")
        print(" • Checking [x] or [X] in markdown marks task complete in Godspeed")
        print(" • Unchecking [ ] in markdown marks task incomplete in Godspeed")
        print(" • Adding new tasks in markdown creates them in Godspeed")
        print(" • Changes sync both directions during 'godspeed sync'")


if __name__ == "__main__":
    demo_completion_sync()
@@ -1,11 +1,21 @@
import os
import sys
import logging

from datetime import datetime

import msal
import aiohttp

# Suppress debug logging from authentication and HTTP libraries
logging.getLogger("msal").setLevel(logging.ERROR)
logging.getLogger("urllib3").setLevel(logging.ERROR)
logging.getLogger("requests").setLevel(logging.ERROR)
logging.getLogger("requests_oauthlib").setLevel(logging.ERROR)
logging.getLogger("aiohttp").setLevel(logging.ERROR)
logging.getLogger("aiohttp.access").setLevel(logging.ERROR)
logging.getLogger("asyncio").setLevel(logging.ERROR)


from textual.app import App, ComposeResult
from textual.binding import Binding
@@ -24,11 +34,11 @@ from textual import work
from textual.widgets.option_list import Option

# Import file icons utility - note the updated import
from utils.file_icons import get_file_icon
from src.utils.file_icons import get_file_icon

# Import our DocumentViewerScreen
sys.path.append(os.path.join(os.path.dirname(__file__), "maildir_gtd"))
from maildir_gtd.screens.DocumentViewer import DocumentViewerScreen
sys.path.append(os.path.join(os.path.dirname(__file__), "src", "maildir_gtd"))
from screens.DocumentViewer import DocumentViewerScreen


class FolderHistoryEntry:

187
install.sh
Executable file
@@ -0,0 +1,187 @@
#!/bin/bash
# Installation script for luk

set -e

# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m' # No Color

# Function to print colored output
print_status() {
    echo -e "${GREEN}[INFO]${NC} $1"
}

print_warning() {
    echo -e "${YELLOW}[WARNING]${NC} $1"
}

print_error() {
    echo -e "${RED}[ERROR]${NC} $1"
}

# Check if Python 3.12+ is installed
check_python() {
    print_status "Checking Python installation..."

    if command -v python3 &> /dev/null; then
        PYTHON_VERSION=$(python3 -c 'import sys; print(".".join(map(str, sys.version_info[:2])))')
        REQUIRED_VERSION="3.12"

        if python3 -c "import sys; exit(0 if sys.version_info >= (3, 12) else 1)"; then
            print_status "Python $PYTHON_VERSION found ✓"
        else
            print_error "Python $REQUIRED_VERSION or higher is required. Found: $PYTHON_VERSION"
            exit 1
        fi
    else
        print_error "Python 3 is not installed"
        exit 1
    fi
}

# Check if uv is installed, install if not
check_uv() {
    print_status "Checking uv installation..."

    if command -v uv &> /dev/null; then
        print_status "uv found ✓"
    else
        print_warning "uv not found, installing..."
        curl -LsSf https://astral.sh/uv/install.sh | sh
        export PATH="$HOME/.cargo/bin:$PATH"

        if command -v uv &> /dev/null; then
            print_status "uv installed successfully ✓"
        else
            print_error "Failed to install uv"
            exit 1
        fi
    fi
}

# Install the package
install_package() {
    print_status "Installing luk..."

    # Create virtual environment and install
    uv venv
    source .venv/bin/activate
    uv pip install -e .

    print_status "Installation completed ✓"
}

# Setup configuration directories
setup_config() {
    print_status "Setting up configuration directories..."

    # Create necessary directories
    mkdir -p "$HOME/.config/luk"
    mkdir -p "$HOME/.local/share/luk"
    mkdir -p "$HOME/.local/share/luk/logs"

    # Create example configuration
    cat > "$HOME/.config/luk/config.env" << EOF
# luk Configuration
# Copy this file and modify as needed

# Microsoft Graph settings
# These will be prompted for on first run
# MICROSOFT_CLIENT_ID=your_client_id
# MICROSOFT_TENANT_ID=your_tenant_id

# Email settings
MAILDIR_PATH=~/Mail
NOTES_DIR=~/Documents/Notes

# Godspeed settings
# GODSPEED_EMAIL=your_email@example.com
# GODSPEED_PASSWORD=your_password
# GODSPEED_TOKEN=your_token
# GODSPEED_SYNC_DIR=~/Documents/Godspeed

# TickTick settings
# TICKTICK_CLIENT_ID=your_client_id
# TICKTICK_CLIENT_SECRET=your_client_secret

# Sync settings
DEFAULT_ORG=corteva
DEFAULT_CALENDAR_DIR=~/Calendar
SYNC_INTERVAL=300 # 5 minutes
LOG_LEVEL=INFO
EOF

    print_status "Configuration directories created ✓"
    print_warning "Please edit $HOME/.config/luk/config.env with your settings"
}

# Create shell completions
setup_completions() {
    print_status "Setting up shell completions..."

    # Get the shell type
    SHELL_TYPE=$(basename "$SHELL")

    case $SHELL_TYPE in
        bash)
            echo 'eval "$(_LUK_COMPLETE=bash_source luk)"' >> "$HOME/.bashrc"
            print_status "Bash completions added to ~/.bashrc"
            ;;
        zsh)
            echo 'eval "$(_LUK_COMPLETE=zsh_source luk)"' >> "$HOME/.zshrc"
            print_status "Zsh completions added to ~/.zshrc"
            ;;
        fish)
            echo '_LUK_COMPLETE=fish_source luk | source' >> "$HOME/.config/fish/config.fish"
            print_status "Fish completions added to ~/.config/fish/config.fish"
            ;;
        *)
            print_warning "Unsupported shell: $SHELL_TYPE"
            ;;
    esac
}

# Run tests
run_tests() {
    print_status "Running tests..."

    source .venv/bin/activate
    if uv run pytest tests/ -v; then
        print_status "All tests passed ✓"
    else
        print_warning "Some tests failed, but installation will continue"
    fi
}

# Main installation flow
main() {
    echo
    echo " luk - Look at your Outlook data locally"
    echo " ========================================="
    echo
    print_status "Starting luk installation..."

    check_python
    check_uv
    install_package
    setup_config
    setup_completions
    run_tests

    print_status "Installation completed successfully! 🎉"
    echo
    print_status "To get started:"
    echo " 1. Source your shell profile: source ~/.bashrc (or ~/.zshrc)"
    echo " 2. Configure your settings in ~/.config/luk/config.env"
    echo " 3. Run: luk sync --help"
    echo " 4. Try the dashboard: luk sync run --dashboard"
    echo " 5. Start the daemon: luk sync run --daemon"
    echo
    print_status "For more information, see: https://github.com/timothybendt/luk"
}

# Run the installation
main "$@"
323
luk.egg-info/PKG-INFO
Normal file
@@ -0,0 +1,323 @@
Metadata-Version: 2.4
Name: luk
Version: 0.1.0
Summary: A CLI tool for syncing Microsoft Outlook email, calendar, and tasks to local file-based formats. Look at your Outlook data locally.
Author-email: Timothy Bendt <timothy@example.com>
License: MIT
Project-URL: Homepage, https://github.com/timothybendt/luk
Project-URL: Repository, https://github.com/timothybendt/luk
Project-URL: Issues, https://github.com/timothybendt/luk/issues
Project-URL: Documentation, https://github.com/timothybendt/luk#readme
Keywords: email,calendar,tasks,sync,cli,microsoft-graph,outlook,maildir,vdir
Classifier: Development Status :: 4 - Beta
Classifier: Environment :: Console
Classifier: Intended Audience :: End Users/Desktop
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Topic :: Communications :: Email
Classifier: Topic :: Office/Business :: Scheduling
Classifier: Topic :: Utilities
Requires-Python: >=3.12
Description-Content-Type: text/markdown
Requires-Dist: aiohttp>=3.11.18
Requires-Dist: certifi>=2025.4.26
Requires-Dist: click>=8.1.0
Requires-Dist: html2text>=2025.4.15
Requires-Dist: mammoth>=1.9.0
Requires-Dist: markitdown[all]>=0.1.1
Requires-Dist: msal>=1.32.3
Requires-Dist: openai>=1.78.1
Requires-Dist: orjson>=3.10.18
Requires-Dist: pillow>=11.2.1
Requires-Dist: python-dateutil>=2.9.0.post0
Requires-Dist: python-docx>=1.1.2
Requires-Dist: requests>=2.31.0
Requires-Dist: rich>=14.0.0
Requires-Dist: textual>=3.2.0
Requires-Dist: textual-image>=0.8.2
Requires-Dist: ticktick-py>=2.0.0

# luk

> Pronounced "look" - as in "look at your Outlook data locally"

A CLI tool for syncing Microsoft Outlook email, calendar, and tasks to local file-based formats like Maildir and vdir. Use your favorite terminal tools to manage your email and calendar.

## Features

- **Email Synchronization**: Sync emails with Microsoft Graph API to local Maildir format
- **Calendar Management**: Two-way calendar sync with vdir/ICS support
- **Task Integration**: Sync with Godspeed and TickTick task managers
- **TUI Dashboard**: Interactive terminal dashboard for monitoring sync progress
- **Daemon Mode**: Background daemon with proper Unix logging
- **Cross-Platform**: Works on macOS, Linux, and Windows

## Quick Start

### Prerequisites

- Python 3.12 or higher
- `uv` package manager (recommended)

### Installation

```bash
# Clone the repository
git clone https://github.com/timothybendt/luk.git
cd luk

# Run the installation script
./install.sh
```

### Manual Installation

```bash
# Create virtual environment
python3 -m venv .venv
source .venv/bin/activate

# Install dependencies
pip install -e .

# Setup configuration directories
mkdir -p ~/.config/luk
mkdir -p ~/.local/share/luk
```

## Configuration

Create a configuration file at `~/.config/luk/config.env`:

```bash
# Microsoft Graph settings
MICROSOFT_CLIENT_ID=your_client_id
MICROSOFT_TENANT_ID=your_tenant_id

# Email settings
MAILDIR_PATH=~/Mail
NOTES_DIR=~/Documents/Notes

# Godspeed settings
GODSPEED_EMAIL=your_email@example.com
GODSPEED_PASSWORD=your_password
GODSPEED_TOKEN=your_token
GODSPEED_SYNC_DIR=~/Documents/Godspeed

# TickTick settings
TICKTICK_CLIENT_ID=your_client_id
TICKTICK_CLIENT_SECRET=your_client_secret
```
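For illustration, a minimal sketch of how a `KEY=VALUE` file like this could be loaded into the process environment; this is an assumption for clarity, not necessarily how luk parses `config.env`:

```python
# Sketch only: one way a KEY=VALUE env file could be loaded.
# How luk actually reads config.env is not shown in this diff.
import os
from pathlib import Path


def load_env_file(path: Path) -> None:
    for raw in path.read_text().splitlines():
        line = raw.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        key, _, value = line.partition("=")
        os.environ.setdefault(key.strip(), value.strip())


load_env_file(Path.home() / ".config" / "luk" / "config.env")
```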

## Usage

### Basic Commands

```bash
# Show help
luk --help

# Run sync with default settings
luk sync run

# Run with TUI dashboard
luk sync run --dashboard

# Start daemon mode
luk sync run --daemon

# Stop daemon
luk sync stop

# Check daemon status
luk sync status
```

### Sync Options

```bash
# Dry run (no changes)
luk sync run --dry-run

# Specify organization
luk sync run --org mycompany

# Enable notifications
luk sync run --notify

# Download attachments
luk sync run --download-attachments

# Two-way calendar sync
luk sync run --two-way-calendar

# Custom calendar directory
luk sync run --vdir ~/Calendars
```

### Dashboard Mode

The TUI dashboard provides real-time monitoring of sync operations:

- **Status Display**: Current sync status and metrics
- **Progress Bars**: Visual progress for each sync component
- **Activity Log**: Scrollable log of all sync activities
- **Keyboard Shortcuts** (wired roughly as in the sketch below):
  - `q`: Quit dashboard
  - `l`: Toggle log visibility
  - `r`: Refresh status
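As a rough illustration, key bindings like these could be declared via Textual's `Binding` API; the action names below are assumptions, not the dashboard's actual code:

```python
# Sketch only: how q/l/r bindings might be wired in a Textual app.
# The custom action method names are illustrative assumptions.
from textual.app import App
from textual.binding import Binding


class DashboardSketch(App):
    BINDINGS = [
        Binding("q", "quit", "Quit dashboard"),    # "quit" is a built-in action
        Binding("l", "toggle_log", "Toggle log"),  # assumed custom action
        Binding("r", "refresh_status", "Refresh"), # assumed custom action
    ]

    def action_toggle_log(self) -> None:
        """Would show or hide the activity log widget."""

    def action_refresh_status(self) -> None:
        """Would re-query sync state and repaint the status display."""
```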

### Daemon Mode

Run luk as a background daemon with proper Unix logging:

```bash
# Start daemon
luk sync run --daemon

# Check status
luk sync status

# View logs
cat ~/.local/share/luk/luk.log

# Stop daemon
luk sync stop
```

Daemon logs are stored at `~/.local/share/luk/luk.log` with automatic rotation.
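Rotation of this kind is available from the standard library; the sketch below uses assumed size limits, not luk's actual settings:

```python
# Sketch only: rotating the daemon log with the standard library.
# maxBytes/backupCount values are illustrative assumptions.
import logging
from logging.handlers import RotatingFileHandler
from pathlib import Path

log_path = Path.home() / ".local" / "share" / "luk" / "luk.log"
log_path.parent.mkdir(parents=True, exist_ok=True)

handler = RotatingFileHandler(log_path, maxBytes=5_000_000, backupCount=3)
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
logging.getLogger("luk").addHandler(handler)
```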

## Architecture

### Core Components

- **Sync Engine**: Handles email, calendar, and task synchronization
- **TUI Dashboard**: Interactive monitoring interface using Textual
- **Daemon Service**: Background service with logging and process management
- **Configuration**: Environment-based configuration system

### Directory Structure

```
src/
├── cli/                    # CLI commands and interfaces
│   ├── sync.py             # Main sync command
│   ├── sync_dashboard.py   # TUI dashboard
│   ├── sync_daemon.py      # Daemon service
│   └── ...
├── services/               # External service integrations
│   ├── microsoft_graph/    # Microsoft Graph API
│   ├── godspeed/           # Godspeed task manager
│   ├── ticktick/           # TickTick API
│   └── ...
└── utils/                  # Utility functions
```

## Development

### Setup Development Environment

```bash
# Clone repository
git clone https://github.com/timothybendt/luk.git
cd luk

# Install development dependencies
uv sync --dev

# Run tests
uv run pytest

# Run linting
uv run ruff check .
uv run ruff format .

# Type checking
uv run mypy src/
```

### Project Structure

- `pyproject.toml`: Project configuration and dependencies
- `src/cli/`: CLI commands and user interfaces
- `src/services/`: External service integrations
- `src/utils/`: Shared utilities and helpers
- `tests/`: Test suite

### Building for Distribution

```bash
# Build package
uv run build

# Check package
uv run twine check dist/*

# Upload to PyPI (for maintainers)
uv run twine upload dist/*
```

## Troubleshooting

### Common Issues

1. **Authentication Errors**: Ensure Microsoft Graph credentials are properly configured
2. **Permission Denied**: Check file permissions for Maildir and calendar directories
3. **Daemon Not Starting**: Verify log directory exists and is writable
4. **TUI Not Rendering**: Ensure your terminal meets Textual's requirements

### Debug Mode

Enable debug logging:

```bash
export LOG_LEVEL=DEBUG
luk sync run --dry-run
```

### Log Files

- **Daemon Logs**: `~/.local/share/luk/luk.log`
- **Sync State**: `~/.local/share/luk/sync_state.json`
- **Configuration**: `~/.config/luk/`

## Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests for new functionality
5. Run the test suite
6. Submit a pull request

### Code Style

This project uses:
- **Ruff** for linting and formatting
- **MyPy** for type checking
- **Black** for code formatting
- **Pre-commit** hooks for quality control

## License

MIT License - see LICENSE file for details.

## Support

- **Issues**: [GitHub Issues](https://github.com/timothybendt/luk/issues)
- **Documentation**: [GitHub Wiki](https://github.com/timothybendt/luk/wiki)
- **Discussions**: [GitHub Discussions](https://github.com/timothybendt/luk/discussions)

## Changelog

### v0.1.0
- Initial release
- Email synchronization with Microsoft Graph
- Calendar sync with vdir/ICS support
- Godspeed and TickTick integration
- TUI dashboard
- Daemon mode with logging
- Cross-platform support
76
luk.egg-info/SOURCES.txt
Normal file
@@ -0,0 +1,76 @@
README.md
pyproject.toml
luk.egg-info/PKG-INFO
luk.egg-info/SOURCES.txt
luk.egg-info/dependency_links.txt
luk.egg-info/entry_points.txt
luk.egg-info/requires.txt
luk.egg-info/top_level.txt
src/cli/__init__.py
src/cli/__main__.py
src/cli/calendar.py
src/cli/drive.py
src/cli/email.py
src/cli/gitlab_monitor.py
src/cli/godspeed.py
src/cli/sync.py
src/cli/sync_daemon.py
src/cli/sync_dashboard.py
src/cli/ticktick.py
src/maildir_gtd/__init__.py
src/maildir_gtd/app.py
src/maildir_gtd/email_viewer.tcss
src/maildir_gtd/message_store.py
src/maildir_gtd/utils.py
src/maildir_gtd/actions/__init__.py
src/maildir_gtd/actions/archive.py
src/maildir_gtd/actions/delete.py
src/maildir_gtd/actions/newest.py
src/maildir_gtd/actions/next.py
src/maildir_gtd/actions/oldest.py
src/maildir_gtd/actions/open.py
src/maildir_gtd/actions/previous.py
src/maildir_gtd/actions/show_message.py
src/maildir_gtd/actions/task.py
src/maildir_gtd/screens/CreateTask.py
src/maildir_gtd/screens/DocumentViewer.py
src/maildir_gtd/screens/OpenMessage.py
src/maildir_gtd/screens/__init__.py
src/maildir_gtd/widgets/ContentContainer.py
src/maildir_gtd/widgets/EnvelopeHeader.py
src/maildir_gtd/widgets/__init__.py
src/services/__init__.py
src/services/gitlab_monitor/__init__.py
src/services/gitlab_monitor/config.py
src/services/gitlab_monitor/daemon.py
src/services/gitlab_monitor/gitlab_client.py
src/services/gitlab_monitor/notifications.py
src/services/gitlab_monitor/openai_analyzer.py
src/services/godspeed/__init__.py
src/services/godspeed/client.py
src/services/godspeed/config.py
src/services/godspeed/sync.py
src/services/himalaya/__init__.py
src/services/himalaya/client.py
src/services/microsoft_graph/__init__.py
src/services/microsoft_graph/auth.py
src/services/microsoft_graph/calendar.py
src/services/microsoft_graph/client.py
src/services/microsoft_graph/mail.py
src/services/taskwarrior/__init__.py
src/services/taskwarrior/client.py
src/services/ticktick/__init__.py
src/services/ticktick/auth.py
src/services/ticktick/client.py
src/services/ticktick/direct_client.py
src/utils/calendar_utils.py
src/utils/file_icons.py
src/utils/notifications.py
src/utils/platform.py
src/utils/ticktick_utils.py
src/utils/mail_utils/__init__.py
src/utils/mail_utils/helpers.py
src/utils/mail_utils/maildir.py
tests/test_platform.py
tests/test_sync_daemon.py
tests/test_sync_dashboard.py
1
luk.egg-info/dependency_links.txt
Normal file
@@ -0,0 +1 @@

2
luk.egg-info/entry_points.txt
Normal file
@@ -0,0 +1,2 @@
[console_scripts]
luk = src.cli.__main__:main
17
luk.egg-info/requires.txt
Normal file
@@ -0,0 +1,17 @@
aiohttp>=3.11.18
certifi>=2025.4.26
click>=8.1.0
html2text>=2025.4.15
mammoth>=1.9.0
markitdown[all]>=0.1.1
msal>=1.32.3
openai>=1.78.1
orjson>=3.10.18
pillow>=11.2.1
python-dateutil>=2.9.0.post0
python-docx>=1.1.2
requests>=2.31.0
rich>=14.0.0
textual>=3.2.0
textual-image>=0.8.2
ticktick-py>=2.0.0
1
luk.egg-info/top_level.txt
Normal file
@@ -0,0 +1 @@
src
7
mise.toml
Normal file
@@ -0,0 +1,7 @@
[tools]
bun = "latest"
node = "22.17.1"
uv = "latest"

[settings]
python.uv_venv_auto = true
107
pyproject.toml
@@ -1,11 +1,31 @@
[project]
name = "gtd-terminal-tools"
name = "luk"
version = "0.1.0"
description = "Add your description here"
description = "A CLI tool for syncing Microsoft Outlook email, calendar, and tasks to local file-based formats. Look at your Outlook data locally."
readme = "README.md"
requires-python = ">=3.12"
license = {text = "MIT"}
authors = [
    {name = "Timothy Bendt", email = "timothy@example.com"}
]
keywords = ["email", "calendar", "tasks", "sync", "cli", "microsoft-graph", "outlook", "maildir", "vdir"]
classifiers = [
    "Development Status :: 4 - Beta",
    "Environment :: Console",
    "Intended Audience :: End Users/Desktop",
    "License :: OSI Approved :: MIT License",
    "Operating System :: OS Independent",
    "Programming Language :: Python :: 3",
    "Programming Language :: Python :: 3.12",
    "Programming Language :: Python :: 3.13",
    "Topic :: Communications :: Email",
    "Topic :: Office/Business :: Scheduling",
    "Topic :: Utilities",
]
dependencies = [
    "aiohttp>=3.11.18",
    "certifi>=2025.4.26",
    "click>=8.1.0",
    "html2text>=2025.4.15",
    "mammoth>=1.9.0",
    "markitdown[all]>=0.1.1",
@@ -15,13 +35,96 @@ dependencies = [
    "pillow>=11.2.1",
    "python-dateutil>=2.9.0.post0",
    "python-docx>=1.1.2",
    "requests>=2.31.0",
    "rich>=14.0.0",
    "textual>=3.2.0",
    "textual-image>=0.8.2",
    "ticktick-py>=2.0.0",
]

[project.scripts]
luk = "src.cli.__main__:main"

[project.urls]
Homepage = "https://github.com/timothybendt/luk"
Repository = "https://github.com/timothybendt/luk"
Issues = "https://github.com/timothybendt/luk/issues"
Documentation = "https://github.com/timothybendt/luk#readme"

[dependency-groups]
dev = [
    "ruff>=0.11.8",
    "textual>=3.2.0",
    "pytest>=8.0.0",
    "pytest-asyncio>=0.24.0",
    "pytest-cov>=6.0.0",
    "black>=24.0.0",
    "mypy>=1.8.0",
    "pre-commit>=3.5.0",
    "build>=1.0.0",
    "twine>=5.0.0",
]

[tool.ruff]
line-length = 88
target-version = "py312"
select = ["E", "F", "W", "I", "N", "UP", "B", "A", "C4", "DTZ", "T10", "EM", "ISC", "ICN", "G", "PIE", "PYI", "PT", "Q", "RSE", "RET", "SIM", "TID", "TCH", "ARG", "PTH", "ERA", "PGH", "PL", "TRY", "NPY", "RUF"]
ignore = ["E501", "PLR0913", "PLR0915"]

[tool.ruff.format]
quote-style = "double"
indent-style = "space"

[tool.mypy]
python_version = "3.12"
warn_return_any = true
warn_unused_configs = true
disallow_untyped_defs = true
disallow_incomplete_defs = true
check_untyped_defs = true
disallow_untyped_decorators = true
no_implicit_optional = true
warn_redundant_casts = true
warn_unused_ignores = true
warn_no_return = true
warn_unreachable = true
strict_equality = true

[tool.pytest.ini_options]
testpaths = ["tests"]
python_files = ["test_*.py", "*_test.py"]
python_classes = ["Test*"]
python_functions = ["test_*"]
addopts = [
    "--cov=src",
    "--cov-report=term-missing",
    "--cov-report=html",
    "--cov-fail-under=80",
    "-v"
]
asyncio_mode = "auto"

[tool.black]
line-length = 88
target-version = ['py312']
include = '\.pyi?$'
extend-exclude = '''
/(
  # directories
  \.eggs
  | \.git
  | \.hg
  | \.mypy_cache
  | \.tox
  | \.venv
  | build
  | dist
)/
'''

[tool.setuptools.packages.find]
where = ["."]
include = ["src*"]

[tool.setuptools.package-data]
"*" = ["*.tcss", "*.css", "*.json", "*.md"]

152
sendmail
Executable file
@@ -0,0 +1,152 @@
#!/usr/bin/env python3
"""
Sendmail-compatible wrapper for Microsoft Graph email sending.
Queues emails in maildir format for processing by the sync daemon.
"""

import sys
import os
import time
import logging
from email.parser import Parser
from email.utils import parseaddr

# Add the project root to Python path
sys.path.insert(0, os.path.dirname(os.path.abspath(__file__)))

from src.utils.mail_utils.helpers import ensure_directory_exists


def extract_org_from_email(email_address: str) -> str:
    """
    Extract organization name from email address domain.

    Args:
        email_address: Email address like "user@corteva.com"

    Returns:
        Organization name (e.g., "corteva")
    """
    if "@" not in email_address:
        return "default"

    domain = email_address.split("@")[1].lower()

    # Map known domains to org names
    domain_to_org = {
        "corteva.com": "corteva",
        # Add more domain mappings as needed
    }

    return domain_to_org.get(domain, domain.split(".")[0])


def create_outbox_structure(base_path: str, org: str):
    """
    Create maildir structure for outbox.

    Args:
        base_path: Base maildir path (e.g., ~/Mail)
        org: Organization name
    """
    org_path = os.path.join(base_path, org, "outbox")
    ensure_directory_exists(os.path.join(org_path, "new"))
    ensure_directory_exists(os.path.join(org_path, "cur"))
    ensure_directory_exists(os.path.join(org_path, "tmp"))
    ensure_directory_exists(os.path.join(org_path, "failed"))


def queue_email(email_content: str, org: str) -> bool:
    """
    Queue email in maildir outbox for sending.

    Args:
        email_content: Raw email content
        org: Organization name

    Returns:
        True if queued successfully, False otherwise
    """
    try:
        # Get base maildir path
        base_path = os.path.expanduser(os.getenv("MAILDIR_PATH", "~/Mail"))

        # Create outbox structure
        create_outbox_structure(base_path, org)

        # Generate unique filename
        timestamp = str(int(time.time() * 1000000))
        hostname = os.uname().nodename
        filename = f"{timestamp}.{os.getpid()}.{hostname}"

        # Write to tmp first, then move to new (atomic operation)
        tmp_path = os.path.join(base_path, org, "outbox", "tmp", filename)
        new_path = os.path.join(base_path, org, "outbox", "new", filename)

        with open(tmp_path, "w", encoding="utf-8") as f:
            f.write(email_content)

        os.rename(tmp_path, new_path)

        return True

    except Exception as e:
        logging.error(f"Failed to queue email: {e}")
        return False


def main():
    """
    Main sendmail wrapper function.
    Reads email from stdin and queues it for sending.
    """
    # Set up basic logging
    logging.basicConfig(
        level=logging.INFO,
        format="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
        handlers=[
            logging.FileHandler(os.path.expanduser("~/Mail/sendmail.log")),
        ]
    )

    try:
        # Read email from stdin
        email_content = sys.stdin.read()

        if not email_content.strip():
            logging.error("No email content received")
            sys.exit(1)

        # Parse email to extract From header
        parser = Parser()
        msg = parser.parsestr(email_content)

        from_header = msg.get("From", "")
        if not from_header:
            logging.error("No From header found in email")
            sys.exit(1)

        # Extract email address from From header
        _, from_email = parseaddr(from_header)
        if not from_email:
            logging.error(f"Could not parse email address from From header: {from_header}")
            sys.exit(1)

        # Determine organization from email domain
        org = extract_org_from_email(from_email)

        # Queue the email
        if queue_email(email_content, org):
            logging.info(f"Email queued successfully for org: {org}, from: {from_email}")
            sys.exit(0)
        else:
            logging.error("Failed to queue email")
            sys.exit(1)

    except Exception as e:
        logging.error(f"Sendmail wrapper error: {e}")
        sys.exit(1)


if __name__ == "__main__":
    main()
@@ -26,13 +26,13 @@
    gsub(/mailto:[^[:space:]]*/, "")

    # Clean up email headers - make them bold
    if (/^From:/) { gsub(/^From:[[:space:]]*/, "**From:** ") }
    if (/^To:/) { gsub(/^To:[[:space:]]*/, "**To:** ") }
    if (/^Subject:/) { gsub(/^Subject:[[:space:]]*/, "**Subject:** ") }
    if (/^Date:/) { gsub(/^Date:[[:space:]]*/, "**Date:** ") }
    # if (/^From:/) { gsub(/^From:[[:space:]]*/, "**From:** ") }
    # if (/^To:/) { gsub(/^To:[[:space:]]*/, "**To:** ") }
    # if (/^Subject:/) { gsub(/^Subject:[[:space:]]*/, "**Subject:** ") }
    # if (/^Date:/) { gsub(/^Date:[[:space:]]*/, "**Date:** ") }

    # Skip empty lines
    if (/^[[:space:]]*$/) next

    print
  }
}

@@ -6,6 +6,9 @@ from .sync import sync
from .drive import drive
from .email import email
from .calendar import calendar
from .ticktick import ticktick
from .godspeed import godspeed
from .gitlab_monitor import gitlab_monitor


@click.group()
@@ -18,3 +21,13 @@ cli.add_command(sync)
cli.add_command(drive)
cli.add_command(email)
cli.add_command(calendar)
cli.add_command(ticktick)
cli.add_command(godspeed)
cli.add_command(gitlab_monitor)

# Add 'tt' as a short alias for ticktick
cli.add_command(ticktick, name="tt")
# Add 'gs' as a short alias for godspeed
cli.add_command(godspeed, name="gs")
# Add 'glm' as a short alias for gitlab_monitor
cli.add_command(gitlab_monitor, name="glm")

@@ -1,4 +1,10 @@
from . import cli

if __name__ == "__main__":

def main():
    """Main entry point for the CLI."""
    cli()


if __name__ == "__main__":
    main()

@@ -1,9 +1,12 @@
import click
import subprocess
import os


@click.command()
def drive():
    """View OneDrive files."""
    click.echo("Launching OneDrive viewer...")
    subprocess.run(["python3", "src/drive_view_tui.py"])
    # Get the directory containing this file, then go up to project root
    current_dir = os.path.dirname(os.path.dirname(os.path.dirname(__file__)))
    subprocess.run(["python3", "drive_view_tui.py"], cwd=current_dir)

152
src/cli/gitlab_monitor.py
Normal file
@@ -0,0 +1,152 @@
import click
import asyncio
import os
import signal
import subprocess
import sys
from pathlib import Path


@click.group()
def gitlab_monitor():
    """GitLab pipeline monitoring daemon."""
    pass


@gitlab_monitor.command()
@click.option("--config", help="Path to configuration file")
@click.option("--daemon", "-d", is_flag=True, help="Run in background as daemon")
def start(config, daemon):
    """Start the GitLab pipeline monitoring daemon."""
    daemon_path = os.path.join(
        os.path.dirname(__file__), "..", "services", "gitlab_monitor", "daemon.py"
    )

    if daemon:
        # Run as background daemon
        click.echo("Starting GitLab pipeline monitor daemon in background...")

        cmd = [sys.executable, daemon_path]
        if config:
            cmd.extend(["--config", config])

        # Create pid file
        pid_file = os.path.expanduser("~/.config/luk/gitlab_monitor.pid")
        Path(pid_file).parent.mkdir(parents=True, exist_ok=True)

        # Start daemon process
        process = subprocess.Popen(
            cmd,
            stdout=subprocess.DEVNULL,
            stderr=subprocess.DEVNULL,
            preexec_fn=os.setsid,
        )

        # Save PID
        with open(pid_file, "w") as f:
            f.write(str(process.pid))

        click.echo(f"Daemon started with PID {process.pid}")
        click.echo(f"PID file: {pid_file}")
    else:
        # Run in foreground
        click.echo("Starting GitLab pipeline monitor (press Ctrl+C to stop)...")

        # Import and run the daemon
        from src.services.gitlab_monitor.daemon import main

        asyncio.run(main())


@gitlab_monitor.command()
def stop():
    """Stop the GitLab pipeline monitoring daemon."""
    pid_file = os.path.expanduser("~/.config/luk/gitlab_monitor.pid")

    if not os.path.exists(pid_file):
        click.echo("Daemon is not running (no PID file found)")
        return

    try:
        with open(pid_file, "r") as f:
            pid = int(f.read().strip())

        # Send SIGTERM to process group
        os.killpg(os.getpgid(pid), signal.SIGTERM)

        # Remove PID file
        os.unlink(pid_file)

        click.echo(f"Daemon stopped (PID {pid})")
    except (ValueError, ProcessLookupError, OSError) as e:
        click.echo(f"Error stopping daemon: {e}")
        # Clean up stale PID file
        if os.path.exists(pid_file):
            os.unlink(pid_file)


@gitlab_monitor.command()
def status():
    """Check the status of the GitLab pipeline monitoring daemon."""
    pid_file = os.path.expanduser("~/.config/luk/gitlab_monitor.pid")

    if not os.path.exists(pid_file):
        click.echo("Daemon is not running")
        return

    try:
        with open(pid_file, "r") as f:
            pid = int(f.read().strip())

        # Check if process exists
        os.kill(pid, 0)  # Send signal 0 to check if process exists
        click.echo(f"Daemon is running (PID {pid})")
    except (ValueError, ProcessLookupError, OSError):
        click.echo("Daemon is not running (stale PID file)")
        # Clean up stale PID file
        os.unlink(pid_file)


@gitlab_monitor.command()
@click.option("--config", help="Path to configuration file")
def test(config):
    """Test the configuration and dependencies."""
    from src.services.gitlab_monitor.daemon import GitLabPipelineMonitor

    monitor = GitLabPipelineMonitor(config)

    click.echo("Configuration test:")
    click.echo(
        f"GitLab token configured: {'✓' if monitor.config.get_gitlab_token() else '✗'}"
    )
    click.echo(
        f"OpenAI key configured: {'✓' if monitor.config.get_openai_key() else '✗'}"
    )
    click.echo(f"Subject patterns: {monitor.config.get_subject_patterns()}")
    click.echo(f"Sender patterns: {monitor.config.get_sender_patterns()}")
    click.echo(f"Check interval: {monitor.config.get_check_interval()}s")
    click.echo(f"Config file: {monitor.config.config_path}")


@gitlab_monitor.command()
def config():
    """Show the current configuration file path and create default if needed."""
    from src.services.gitlab_monitor.config import GitLabMonitorConfig

    config = GitLabMonitorConfig()
    click.echo(f"Configuration file: {config.config_path}")

    if os.path.exists(config.config_path):
        click.echo("Configuration file exists")
    else:
        click.echo("Default configuration file created")

    click.echo("\nTo configure the daemon:")
    click.echo("1. Set environment variables:")
    click.echo(" export GITLAB_API_TOKEN='your_gitlab_token'")
    click.echo(" export OPENAI_API_KEY='your_openai_key'")
    click.echo("2. Or edit the configuration file directly")


if __name__ == "__main__":
    gitlab_monitor()
616
src/cli/godspeed.py
Normal file
@@ -0,0 +1,616 @@
"""CLI interface for Godspeed sync functionality."""

import click
import getpass
import os
import sys
from pathlib import Path
from datetime import datetime

from ..services.godspeed.client import GodspeedClient
from ..services.godspeed.sync import GodspeedSync


def get_credentials():
    """Get Godspeed credentials from environment or user input."""
    email = os.getenv("GODSPEED_EMAIL")
    password = os.getenv("GODSPEED_PASSWORD")
    token = os.getenv("GODSPEED_TOKEN")

    if token:
        return None, None, token

    if not email:
        email = click.prompt("Godspeed email")

    if not password:
        password = click.prompt("Godspeed password", hide_input=True)

    return email, password, None


def get_sync_directory():
    """Get sync directory from environment or default."""
    sync_dir = os.getenv("GODSPEED_SYNC_DIR")
    if sync_dir:
        return Path(sync_dir)

    # Default to ~/Documents/Godspeed or ~/.local/share/gtd-terminal-tools/godspeed
    home = Path.home()

    # Try Documents first
    docs_dir = home / "Documents" / "Godspeed"
    if docs_dir.parent.exists():
        return docs_dir

    # Fall back to data directory
    data_dir = home / ".local" / "share" / "gtd-terminal-tools" / "godspeed"
    return data_dir


@click.group()
def godspeed():
    """Godspeed sync tool - bidirectional sync between Godspeed API and markdown files."""
    pass


@godspeed.command()
def download():
    """Download tasks from Godspeed API to local files."""
    email, password, token = get_credentials()
    sync_dir = get_sync_directory()

    try:
        client = GodspeedClient(email=email, password=password, token=token)
        sync_engine = GodspeedSync(client, sync_dir)
        sync_engine.download_from_api()

        click.echo(f"\nTasks downloaded to: {sync_dir}")
        click.echo(
            "You can now edit the markdown files and run 'godspeed upload' to sync changes back."
        )

    except Exception as e:
        click.echo(f"Error during download: {e}", err=True)
        sys.exit(1)


@godspeed.command()
def upload():
    """Upload local markdown files to Godspeed API."""
    email, password, token = get_credentials()
    sync_dir = get_sync_directory()

    if not sync_dir.exists():
        click.echo(f"Sync directory does not exist: {sync_dir}", err=True)
        click.echo("Run 'godspeed download' first to initialize the sync directory.")
        sys.exit(1)

    try:
        client = GodspeedClient(email=email, password=password, token=token)
        sync_engine = GodspeedSync(client, sync_dir)
        sync_engine.upload_to_api()

        click.echo("Local changes uploaded successfully.")

    except Exception as e:
        click.echo(f"Error during upload: {e}", err=True)
        sys.exit(1)


@godspeed.command()
def sync():
    """Perform bidirectional sync between local files and Godspeed API."""
    email, password, token = get_credentials()
    sync_dir = get_sync_directory()

    try:
        client = GodspeedClient(email=email, password=password, token=token)
        sync_engine = GodspeedSync(client, sync_dir)
        sync_engine.sync_bidirectional()

        click.echo(f"\nSync complete. Files are in: {sync_dir}")

    except Exception as e:
        click.echo(f"Error during sync: {e}", err=True)
        sys.exit(1)


@godspeed.command()
def status():
    """Show sync status and directory information."""
    sync_dir = get_sync_directory()

    if not sync_dir.exists():
        click.echo(f"Sync directory does not exist: {sync_dir}")
        click.echo("Run 'godspeed download' or 'godspeed sync' to initialize.")
        return

    # Create a minimal sync engine for status (no API client needed)
    sync_engine = GodspeedSync(None, sync_dir)
    status_info = sync_engine.get_sync_status()

    click.echo(f"Sync Directory: {status_info['sync_directory']}")
    click.echo(f"Local Files: {status_info['local_files']}")
    click.echo(f"Total Local Tasks: {status_info['total_local_tasks']}")
    click.echo(f"Tracked Tasks: {status_info['tracked_tasks']}")
    click.echo(f"Tracked Lists: {status_info['tracked_lists']}")

    if status_info["last_sync"]:
        click.echo(f"Last Sync: {status_info['last_sync']}")
    else:
        click.echo("Last Sync: Never")

    click.echo("\nMarkdown Files:")
    for file_path in sync_engine.list_local_files():
        tasks = sync_engine._read_list_file(file_path)
        completed = sum(
            1 for _, status, _, _ in tasks if status in ["complete", "cleared"]
        )
        total = len(tasks)
        click.echo(f" {file_path.name}: {completed}/{total} completed")


@godspeed.command()
def test_connection():
    """Test connection to Godspeed API with SSL diagnostics."""
    import requests
    import ssl
    import socket

    click.echo("Testing connection to Godspeed API...")

    # Check if SSL bypass is enabled first
    disable_ssl = os.getenv("GODSPEED_DISABLE_SSL_VERIFY", "").lower() == "true"
    if disable_ssl:
        click.echo("⚠️ SSL verification is disabled (GODSPEED_DISABLE_SSL_VERIFY=true)")

    # Test basic connectivity
    ssl_error_occurred = False
    try:
        response = requests.get("https://api.godspeedapp.com", timeout=10)
        click.echo("✓ Basic HTTPS connection successful")
    except requests.exceptions.SSLError as e:
        ssl_error_occurred = True
        click.echo(f"✗ SSL Error: {e}")
        if not disable_ssl:
            click.echo("\n💡 Try setting: export GODSPEED_DISABLE_SSL_VERIFY=true")
    except requests.exceptions.ConnectionError as e:
        click.echo(f"✗ Connection Error: {e}")
        return
    except Exception as e:
        click.echo(f"✗ Unexpected Error: {e}")
        return

    # Test with SSL bypass if enabled and there was an SSL error
    if disable_ssl and ssl_error_occurred:
        try:
            response = requests.get(
                "https://api.godspeedapp.com", verify=False, timeout=10
            )
            click.echo("✓ Connection successful with SSL bypass")
        except Exception as e:
            click.echo(f"✗ Connection failed even with SSL bypass: {e}")
            return

    # Test authentication if credentials available
    email, password, token = get_credentials()
    if token or (email and password):
        try:
            client = GodspeedClient(email=email, password=password, token=token)
            lists = client.get_lists()
            click.echo(f"✓ Authentication successful, found {len(lists)} lists")
        except Exception as e:
            click.echo(f"✗ Authentication failed: {e}")
    else:
        click.echo("ℹ️ No credentials provided for authentication test")

    click.echo("\nConnection test complete!")


@godspeed.command()
def open():
    """Open the sync directory in the default file manager."""
    sync_dir = get_sync_directory()

    if not sync_dir.exists():
        click.echo(f"Sync directory does not exist: {sync_dir}", err=True)
        click.echo("Run 'godspeed download' or 'godspeed sync' to initialize.")
        return

    import subprocess
    import platform

    system = platform.system()
    try:
        if system == "Darwin":  # macOS
            subprocess.run(["open", str(sync_dir)])
        elif system == "Windows":
            subprocess.run(["explorer", str(sync_dir)])
        else:  # Linux
            subprocess.run(["xdg-open", str(sync_dir)])

        click.echo(f"Opened sync directory: {sync_dir}")
    except Exception as e:
        click.echo(f"Could not open directory: {e}", err=True)
        click.echo(f"Sync directory is: {sync_dir}")

class TaskSweeper:
    """Sweeps incomplete tasks from markdown files into Godspeed Inbox."""

    def __init__(self, notes_dir: Path, godspeed_dir: Path, dry_run: bool = False):
        self.notes_dir = Path(notes_dir)
        self.godspeed_dir = Path(godspeed_dir)
        self.dry_run = dry_run
        self.inbox_file = self.godspeed_dir / "Inbox.md"

        # Try to use the sync engine for consistent ID generation and formatting
        try:
            self.sync_engine = GodspeedSync(None, str(godspeed_dir))
        except Exception:
            # Fallback parsing if sync engine fails
            self.sync_engine = None

    def _parse_task_line_fallback(self, line: str):
        """Fallback task parsing if sync engine not available."""
        import re
        import uuid

        # Match patterns like: - [ ] Task title <!-- id:abc123 -->
        task_pattern = (
            r"^\s*-\s*\[([xX\s\-])\]\s*(.+?)(?:\s*<!--\s*id:(\w+)\s*-->)?\s*$"
        )
        match = re.match(task_pattern, line.strip())

        if not match:
            return None

        checkbox, title_and_notes, local_id = match.groups()

        # Determine status
        if checkbox.lower() == "x":
            status = "complete"
        elif checkbox == "-":
            status = "cleared"
        else:
            status = "incomplete"

        # Extract title (remove any inline notes after <!--)
        title = title_and_notes.split("<!--")[0].strip()

        # Generate ID if missing
        if not local_id:
            if hasattr(self, "sync_engine") and self.sync_engine:
                local_id = self.sync_engine._generate_local_id()
            else:
                local_id = str(uuid.uuid4())[:8]

        return local_id, status, title, ""

    def _parse_markdown_file(self, file_path: Path):
        """Parse a markdown file and extract tasks and non-task content."""
        if not file_path.exists():
            return [], []

        tasks = []
        non_task_lines = []

        try:
            import builtins

            with builtins.open(str(file_path), "r", encoding="utf-8") as f:
                lines = f.readlines()
        except Exception as e:
            click.echo(f" ⚠️ Error reading {file_path}: {e}")
            return [], []

        for line in lines:
            line = line.rstrip()

            # Check if this line looks like a task
            if line.strip().startswith("- ["):
                # Always use fallback parsing
                parsed = self._parse_task_line_fallback(line)
                if parsed:
                    tasks.append(parsed)
                    continue

            # Not a task, keep as regular content
            non_task_lines.append(line)

        return tasks, non_task_lines

    def _write_tasks_to_file(self, file_path: Path, tasks):
        """Write tasks to a markdown file."""
        if not tasks:
            return

        file_path.parent.mkdir(parents=True, exist_ok=True)

        import builtins

        # Read existing content if file exists
        existing_content = ""
        if file_path.exists():
            with builtins.open(str(file_path), "r", encoding="utf-8") as f:
                existing_content = f.read()

        # Format new tasks
        new_task_lines = []
        for local_id, status, title, notes in tasks:
            if self.sync_engine:
                formatted = self.sync_engine._format_task_line(
                    local_id, status, title, notes
                )
            else:
                # Fallback formatting
                checkbox = {"incomplete": "[ ]", "complete": "[x]", "cleared": "[-]"}[
                    status
                ]
                formatted = f"- {checkbox} {title} <!-- id:{local_id} -->"
                if notes:
                    formatted += f"\n {notes}"

            new_task_lines.append(formatted)

        # Combine with existing content
        if existing_content.strip():
            new_content = (
                existing_content.rstrip() + "\n\n" + "\n".join(new_task_lines) + "\n"
            )
        else:
            new_content = "\n".join(new_task_lines) + "\n"

        with builtins.open(str(file_path), "w", encoding="utf-8") as f:
            f.write(new_content)

    def _clean_file(self, file_path: Path, non_task_lines):
        """Remove tasks from original file, keeping only non-task content."""
        import builtins

        if not non_task_lines or all(not line.strip() for line in non_task_lines):
            # File would be empty, delete it
            if self.dry_run:
                click.echo(f" 🗑️ Would delete empty file: {file_path}")
            else:
                file_path.unlink()
                click.echo(f" 🗑️ Deleted empty file: {file_path}")
        else:
            # Write back non-task content
            cleaned_content = "\n".join(non_task_lines).strip()
            if cleaned_content:
                cleaned_content += "\n"

            if not self.dry_run:
                with builtins.open(str(file_path), "w", encoding="utf-8") as f:
                    f.write(cleaned_content)
            click.echo(f" ✂️ Cleaned file (removed tasks): {file_path}")

    def find_markdown_files(self):
        """Find all markdown files in the notes directory, excluding Godspeed directory."""
        markdown_files = []

        for md_file in self.notes_dir.rglob("*.md"):
            # Skip files in the Godspeed directory
            if (
                self.godspeed_dir in md_file.parents
                or md_file.parent == self.godspeed_dir
            ):
                continue

            # Skip hidden files and directories
            if any(part.startswith(".") for part in md_file.parts):
                continue

            markdown_files.append(md_file)

        return sorted(markdown_files)

def sweep_tasks(self):
|
||||
"""Sweep incomplete tasks from all markdown files into Inbox."""
|
||||
click.echo(f"🧹 Sweeping incomplete tasks from: {self.notes_dir}")
|
||||
click.echo(f"📥 Target Inbox: {self.inbox_file}")
|
||||
click.echo(f"🔍 Dry run: {self.dry_run}")
|
||||
click.echo("=" * 60)
|
||||
|
||||
markdown_files = self.find_markdown_files()
|
||||
click.echo(f"\n📁 Found {len(markdown_files)} markdown files to process")
|
||||
|
||||
swept_tasks = []
|
||||
processed_files = []
|
||||
|
||||
for file_path in markdown_files:
|
||||
try:
|
||||
rel_path = file_path.relative_to(self.notes_dir)
|
||||
rel_path_str = str(rel_path)
|
||||
except Exception as e:
|
||||
click.echo(f"Error getting relative path for {file_path}: {e}")
|
||||
rel_path_str = str(file_path.name)
|
||||
|
||||
click.echo(f"\n📄 Processing: {rel_path_str}")
|
||||
|
||||
tasks, non_task_lines = self._parse_markdown_file(file_path)
|
||||
|
||||
if not tasks:
|
||||
click.echo(f" ℹ️ No tasks found")
|
||||
continue

            # Separate incomplete tasks from completed/cleared ones
            incomplete_tasks = []
            complete_tasks = []

            for task in tasks:
                local_id, status, title, notes = task
                if status == "incomplete":
                    incomplete_tasks.append(task)
                else:
                    complete_tasks.append(task)

            if incomplete_tasks:
                click.echo(f"  🔄 Found {len(incomplete_tasks)} incomplete tasks:")
                for _, status, title, notes in incomplete_tasks:
                    click.echo(f"    • {title}")
                    if notes:
                        click.echo(f"      Notes: {notes}")

                # Add source file annotation with clean task IDs
                annotated_tasks = []
                for local_id, status, title, notes in incomplete_tasks:
                    # Generate a fresh ID for swept tasks to avoid conflicts
                    if self.sync_engine:
                        fresh_id = self.sync_engine._generate_local_id()
                    else:
                        import uuid

                        fresh_id = str(uuid.uuid4())[:8]

                    # Add source info to notes
                    source_notes = f"From: {rel_path_str}"
                    if notes:
                        combined_notes = f"{notes}\n{source_notes}"
                    else:
                        combined_notes = source_notes
                    annotated_tasks.append((fresh_id, status, title, combined_notes))

                swept_tasks.extend(annotated_tasks)
                processed_files.append(rel_path_str)

            if complete_tasks:
                click.echo(
                    f"  ✅ Keeping {len(complete_tasks)} completed/cleared tasks in place"
                )

            # Reconstruct remaining content (non-tasks + completed tasks)
            remaining_content = non_task_lines.copy()

            # Add completed/cleared tasks back to remaining content
            if complete_tasks:
                remaining_content.append("")  # Empty line before tasks
                for task in complete_tasks:
                    if self.sync_engine:
                        formatted = self.sync_engine._format_task_line(*task)
                    else:
                        local_id, status, title, notes = task
                        checkbox = {
                            "incomplete": "[ ]",
                            "complete": "[x]",
                            "cleared": "[-]",
                        }[status]
                        formatted = f"- {checkbox} {title} <!-- id:{local_id} -->"
                        if notes:
                            formatted += f"\n  {notes}"
                    remaining_content.append(formatted)

            # Clean the original file
            if incomplete_tasks:
                self._clean_file(file_path, remaining_content)

        # Write swept tasks to Inbox
        if swept_tasks:
            click.echo(f"\n📥 Writing {len(swept_tasks)} tasks to Inbox...")
            if not self.dry_run:
                self._write_tasks_to_file(self.inbox_file, swept_tasks)
                click.echo(f"  ✅ Inbox updated: {self.inbox_file}")

        # Summary
        click.echo("\n" + "=" * 60)
        click.echo("📊 SWEEP SUMMARY:")
        click.echo(f"  • Files processed: {len(processed_files)}")
        click.echo(f"  • Tasks swept: {len(swept_tasks)}")
        click.echo(f"  • Target: {self.inbox_file}")

        if self.dry_run:
            click.echo("\n⚠️  DRY RUN - No files were actually modified")
            click.echo("   Run without --dry-run to perform the sweep")

        return {
            "swept_tasks": len(swept_tasks),
            "processed_files": processed_files,
            "inbox_file": str(self.inbox_file),
        }


@godspeed.command()
@click.argument(
    "notes_dir",
    type=click.Path(exists=True, file_okay=False, dir_okay=True, path_type=Path),
    required=False,
)
@click.argument(
    "godspeed_dir",
    type=click.Path(file_okay=False, dir_okay=True, path_type=Path),
    required=False,
)
@click.option(
    "--dry-run", is_flag=True, help="Show what would be done without making changes"
)
def sweep(notes_dir, godspeed_dir, dry_run):
    """Sweep incomplete tasks from markdown files into the Godspeed Inbox.

    NOTES_DIR: Directory containing markdown files with tasks to sweep (optional, defaults to $NOTES_DIR)

    GODSPEED_DIR: Godspeed sync directory (optional, defaults to the sync directory)
    """
    # Handle notes_dir default from environment
    if notes_dir is None:
        notes_dir_env = os.getenv("NOTES_DIR")
        if not notes_dir_env:
            click.echo(
                "❌ No notes directory specified and $NOTES_DIR environment variable not set",
                err=True,
            )
            click.echo("Usage: godspeed sweep <notes_dir> [godspeed_dir]", err=True)
            click.echo(
                "   or: export NOTES_DIR=/path/to/notes && godspeed sweep", err=True
            )
            sys.exit(1)
        notes_dir = Path(notes_dir_env)
        if not notes_dir.exists():
            click.echo(
                f"❌ Notes directory from $NOTES_DIR does not exist: {notes_dir}",
                err=True,
            )
            sys.exit(1)
        if not notes_dir.is_dir():
            click.echo(
                f"❌ Notes path from $NOTES_DIR is not a directory: {notes_dir}",
                err=True,
            )
            sys.exit(1)

    if godspeed_dir is None:
        godspeed_dir = get_sync_directory()

    # Ensure we have Path objects
    notes_dir = Path(notes_dir)
    godspeed_dir = Path(godspeed_dir)

    try:
        sweeper = TaskSweeper(notes_dir, godspeed_dir, dry_run)
        result = sweeper.sweep_tasks()

        if result["swept_tasks"] > 0:
            click.echo(f"\n🎉 Successfully swept {result['swept_tasks']} tasks!")
            if not dry_run:
                click.echo("💡 Next steps:")
                click.echo(f"  1. Review tasks in: {result['inbox_file']}")
                click.echo("  2. Run 'godspeed upload' to sync to API")
                click.echo(
                    "  3. Organize tasks into appropriate lists in the Godspeed app"
                )
        else:
            click.echo("\n✨ No incomplete tasks found to sweep.")

    except Exception as e:
        click.echo(f"❌ Error during sweep: {e}", err=True)
        import traceback

        traceback.print_exc()
        sys.exit(1)


if __name__ == "__main__":
    godspeed()
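For reference, typical invocations of the new `sweep` command look like this (paths are illustrative):

```bash
# Preview the sweep without touching any files
NOTES_DIR=~/notes godspeed sweep --dry-run

# Sweep an explicit notes tree into an explicit Godspeed sync directory
godspeed sweep ~/notes ~/Documents/Godspeed
```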
727 src/cli/sync.py
@@ -2,19 +2,204 @@ import click
import asyncio
import os
import sys
import signal
import json
import time
from datetime import datetime, timedelta
from pathlib import Path
from rich.progress import Progress, SpinnerColumn, MofNCompleteColumn

from src.utils.mail_utils.helpers import ensure_directory_exists
from src.utils.calendar_utils import save_events_to_vdir, save_events_to_file
from src.utils.notifications import notify_new_emails
from src.services.microsoft_graph.calendar import (
    fetch_calendar_events,
    sync_local_calendar_changes,
    get_last_sync_time,
    detect_deleted_events,
)
from src.services.microsoft_graph.mail import (
    fetch_mail_async,
    archive_mail_async,
    delete_mail_async,
    synchronize_maildir_async,
    process_outbox_async,
)
from src.services.microsoft_graph.auth import get_access_token
from src.services.godspeed.client import GodspeedClient
from src.services.godspeed.sync import GodspeedSync


# Timing state management
def get_sync_state_file():
    """Get the path to the sync state file."""
    return os.path.expanduser("~/.local/share/luk/sync_state.json")


def load_sync_state():
    """Load the sync state from file."""
    state_file = get_sync_state_file()
    if os.path.exists(state_file):
        try:
            with open(state_file, "r") as f:
                return json.load(f)
        except Exception:
            pass
    return {
        "last_godspeed_sync": 0,
        "last_sweep_date": None,
        "sweep_completed_today": False,
    }


def save_sync_state(state):
    """Save the sync state to file."""
    state_file = get_sync_state_file()
    os.makedirs(os.path.dirname(state_file), exist_ok=True)
    with open(state_file, "w") as f:
        json.dump(state, f, indent=2)
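

# Illustrative on-disk shape of the sync state file (values are examples,
# not taken from a real run):
#   {
#       "last_godspeed_sync": 1718030000.0,
#       "last_sweep_date": "2024-06-10",
#       "sweep_completed_today": false
#   }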


def should_run_godspeed_sync():
    """Check if Godspeed sync should run (every 15 minutes)."""
    state = load_sync_state()
    current_time = time.time()
    last_sync = state.get("last_godspeed_sync", 0)
    return current_time - last_sync >= 900  # 15 minutes in seconds


def should_run_sweep():
    """Check if the sweep should run (once after 6 PM each day)."""
    state = load_sync_state()
    current_time = datetime.now()

    # Check if it's after 6 PM
    if current_time.hour < 18:
        return False

    # Check if we've already swept today
    today_str = current_time.strftime("%Y-%m-%d")
    last_sweep_date = state.get("last_sweep_date")

    return last_sweep_date != today_str
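

# Worked example of the sweep gate (illustrative): at 17:59 should_run_sweep()
# returns False on the hour check alone; at 18:01 it returns True once, and
# after run_task_sweep() records today's date in the state file it returns
# False for the rest of the day, even across daemon restarts.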


def get_godspeed_sync_directory():
    """Get the Godspeed sync directory from the environment or a default."""
    sync_dir = os.getenv("GODSPEED_SYNC_DIR")
    if sync_dir:
        return Path(sync_dir)

    # Default to ~/Documents/Godspeed or ~/.local/share/luk/godspeed
    home = Path.home()

    # Try Documents first
    docs_dir = home / "Documents" / "Godspeed"
    if docs_dir.parent.exists():
        return docs_dir

    # Fall back to the data directory
    data_dir = home / ".local" / "share" / "luk" / "godspeed"
    return data_dir


def get_godspeed_credentials():
    """Get Godspeed credentials from the environment."""
    email = os.getenv("GODSPEED_EMAIL")
    password = os.getenv("GODSPEED_PASSWORD")
    token = os.getenv("GODSPEED_TOKEN")
    return email, password, token


async def run_godspeed_sync(progress=None):
    """Run Godspeed bidirectional sync."""
    try:
        email, password, token = get_godspeed_credentials()
        if not (token or (email and password)):
            if progress:
                progress.console.print(
                    "[yellow]⚠️  Skipping Godspeed sync: No credentials configured[/yellow]"
                )
            return False

        sync_dir = get_godspeed_sync_directory()

        if progress:
            progress.console.print(
                f"[cyan]🔄 Running Godspeed sync to {sync_dir}...[/cyan]"
            )

        client = GodspeedClient(email=email, password=password, token=token)
        sync_engine = GodspeedSync(client, sync_dir)
        sync_engine.sync_bidirectional()

        # Update sync state
        state = load_sync_state()
        state["last_godspeed_sync"] = time.time()
        save_sync_state(state)

        if progress:
            progress.console.print("[green]✅ Godspeed sync completed[/green]")
        return True

    except Exception as e:
        if progress:
            progress.console.print(f"[red]❌ Godspeed sync failed: {e}[/red]")
        return False


async def run_task_sweep(progress=None):
    """Run the task sweep from the notes directory to the Godspeed inbox."""
    try:
        from src.cli.godspeed import TaskSweeper

        notes_dir_env = os.getenv("NOTES_DIR")
        if not notes_dir_env:
            if progress:
                progress.console.print(
                    "[yellow]⚠️  Skipping task sweep: $NOTES_DIR not configured[/yellow]"
                )
            return False

        notes_dir = Path(notes_dir_env)
        if not notes_dir.exists():
            if progress:
                progress.console.print(
                    f"[yellow]⚠️  Skipping task sweep: Notes directory does not exist: {notes_dir}[/yellow]"
                )
            return False

        godspeed_dir = get_godspeed_sync_directory()

        if progress:
            progress.console.print(
                f"[cyan]🧹 Running task sweep from {notes_dir} to {godspeed_dir}...[/cyan]"
            )

        sweeper = TaskSweeper(notes_dir, godspeed_dir, dry_run=False)
        result = sweeper.sweep_tasks()

        # Update sweep state
        state = load_sync_state()
        state["last_sweep_date"] = datetime.now().strftime("%Y-%m-%d")
        save_sync_state(state)

        if result["swept_tasks"] > 0:
            if progress:
                progress.console.print(
                    f"[green]✅ Task sweep completed: {result['swept_tasks']} tasks swept[/green]"
                )
        else:
            if progress:
                progress.console.print(
                    "[green]✅ Task sweep completed: No tasks to sweep[/green]"
                )
        return True

    except Exception as e:
        if progress:
            progress.console.print(f"[red]❌ Task sweep failed: {e}[/red]")
        return False


# Function to create Maildir structure
@@ -33,6 +218,11 @@ def create_maildir_structure(base_path):
    ensure_directory_exists(os.path.join(base_path, "tmp"))
    ensure_directory_exists(os.path.join(base_path, ".Archives"))
    ensure_directory_exists(os.path.join(base_path, ".Trash", "cur"))
    # Create outbox structure for sending emails
    ensure_directory_exists(os.path.join(base_path, "outbox", "new"))
    ensure_directory_exists(os.path.join(base_path, "outbox", "cur"))
    ensure_directory_exists(os.path.join(base_path, "outbox", "tmp"))
    ensure_directory_exists(os.path.join(base_path, "outbox", "failed"))


async def fetch_calendar_async(
@@ -77,11 +267,12 @@ async def fetch_calendar_async(
        # Update progress bar with total events
        progress.update(task_id, total=total_events)

        # Define org_vdir_path up front if vdir_path is specified
        org_vdir_path = os.path.join(vdir_path, org_name) if vdir_path else None

        # Save events to appropriate format
        if not dry_run:
            if vdir_path and org_vdir_path:
                progress.console.print(
                    f"[cyan]Saving events to vdir: {org_vdir_path}[/cyan]"
                )
@@ -98,7 +289,7 @@ async def fetch_calendar_async(
                    events, f"{ics_path}/events_latest.ics", progress, task_id, dry_run
                )
                progress.console.print(
                    "[green]Finished saving events to ICS file[/green]"
                )
            else:
                # No destination specified
@@ -154,7 +345,7 @@ async def fetch_calendar_async(
            progress.update(task_id, total=next_total_events)

            if not dry_run:
                if vdir_path and org_vdir_path:
                    save_events_to_vdir(
                        next_events, org_vdir_path, progress, task_id, dry_run
                    )
@@ -214,6 +405,8 @@ async def _sync_outlook_data(
    days_forward,
    continue_iteration,
    download_attachments,
    two_way_calendar,
    notify,
):
    """Synchronize data from external sources."""

@@ -221,7 +414,8 @@ async def _sync_outlook_data(
    vdir = os.path.expanduser(vdir)

    # Save emails to Maildir
    base_maildir_path = os.getenv("MAILDIR_PATH", os.path.expanduser("~/Mail"))
    maildir_path = base_maildir_path + f"/{org}"
    attachments_dir = os.path.join(maildir_path, "attachments")
    ensure_directory_exists(attachments_dir)
    create_maildir_structure(maildir_path)
@@ -243,27 +437,74 @@ async def _sync_outlook_data(
    with progress:
        task_fetch = progress.add_task("[green]Syncing Inbox...", total=0)
        task_calendar = progress.add_task("[cyan]Fetching calendar...", total=0)
        task_local_calendar = progress.add_task(
            "[magenta]Syncing local calendar...", total=0
        )
        task_read = progress.add_task("[blue]Marking as read...", total=0)
        task_archive = progress.add_task("[yellow]Archiving mail...", total=0)
        task_delete = progress.add_task("[red]Deleting mail...", total=0)
        task_outbox = progress.add_task(
            "[bright_green]Sending outbound mail...", total=0
        )

        # Stage 1: Synchronize local changes (read, archive, delete, calendar) to the server
        progress.console.print(
            "[bold cyan]Step 1: Syncing local changes to server...[/bold cyan]"
        )

        # Handle calendar sync first (if vdir is specified and two-way sync is enabled)
        calendar_sync_results = (0, 0)
        if vdir and two_way_calendar:
            org_vdir_path = os.path.join(os.path.expanduser(vdir), org)
            progress.console.print(
                f"[magenta]Checking for local calendar changes in {org_vdir_path}...[/magenta]"
            )
            calendar_sync_results = await sync_local_calendar_changes(
                headers, org_vdir_path, progress, task_local_calendar, dry_run
            )

        # Handle mail changes and outbound email in parallel
        await asyncio.gather(
            synchronize_maildir_async(
                maildir_path, headers, progress, task_read, dry_run
            ),
            archive_mail_async(maildir_path, headers, progress, task_archive, dry_run),
            delete_mail_async(maildir_path, headers, progress, task_delete, dry_run),
            process_outbox_async(
                base_maildir_path, org, headers, progress, task_outbox, dry_run
            ),
        )
        progress.console.print("[bold green]Step 1: Local changes synced.[/bold green]")

        # Report calendar sync results
        created, deleted = calendar_sync_results
        if two_way_calendar and (created > 0 or deleted > 0):
            progress.console.print(
                f"[magenta]📅 Two-way calendar sync: {created} events created, {deleted} events deleted[/magenta]"
            )
        elif two_way_calendar:
            progress.console.print(
                "[magenta]📅 Two-way calendar sync: No local changes detected[/magenta]"
            )

        # Stage 2: Fetch new data from the server
        progress.console.print(
            "\n[bold cyan]Step 2: Fetching new data from server...[/bold cyan]"
        )

        # Track messages before sync for notifications
        maildir_path = (
            os.getenv("MAILDIR_PATH", os.path.expanduser("~/Mail")) + f"/{org}"
        )
        messages_before = 0
        new_dir = os.path.join(maildir_path, "new")
        cur_dir = os.path.join(maildir_path, "cur")
        if notify:
            if os.path.exists(new_dir):
                messages_before += len([f for f in os.listdir(new_dir) if ".eml" in f])
            if os.path.exists(cur_dir):
                messages_before += len([f for f in os.listdir(cur_dir) if ".eml" in f])

        await asyncio.gather(
            fetch_mail_async(
                maildir_path,
@@ -287,11 +528,98 @@ async def _sync_outlook_data(
                continue_iteration,
            ),
        )

        # Send notification for new emails if enabled
        if notify and not dry_run:
            messages_after = 0
            if os.path.exists(new_dir):
                messages_after += len([f for f in os.listdir(new_dir) if ".eml" in f])
            if os.path.exists(cur_dir):
                messages_after += len([f for f in os.listdir(cur_dir) if ".eml" in f])

            new_message_count = messages_after - messages_before
            if new_message_count > 0:
                notify_new_emails(new_message_count, org)

        progress.console.print("[bold green]Step 2: New data fetched.[/bold green]")

        # Stage 3: Run Godspeed operations based on timing
        progress.console.print(
            "\n[bold cyan]Step 3: Running Godspeed operations...[/bold cyan]"
        )

        # Check if Godspeed sync should run (every 15 minutes)
        if should_run_godspeed_sync():
            await run_godspeed_sync(progress)
        else:
            progress.console.print("[dim]⏭️  Skipping Godspeed sync (not due yet)[/dim]")

        # Check if the task sweep should run (once after 6 PM daily)
        if should_run_sweep():
            await run_task_sweep(progress)
        else:
            current_hour = datetime.now().hour
            if current_hour < 18:
                progress.console.print(
                    "[dim]⏭️  Skipping task sweep (before 6 PM)[/dim]"
                )
            else:
                progress.console.print(
                    "[dim]⏭️  Skipping task sweep (already completed today)[/dim]"
                )

        progress.console.print(
            "[bold green]Step 3: Godspeed operations completed.[/bold green]"
        )

    click.echo("Sync complete.")


@click.group()
def sync():
    """Email and calendar synchronization."""
    pass


def daemonize():
    """Properly daemonize the process for Unix systems."""
    # First fork
    try:
        pid = os.fork()
        if pid > 0:
            # Parent exits
            sys.exit(0)
    except OSError as e:
        sys.stderr.write(f"Fork #1 failed: {e}\n")
        sys.exit(1)

    # Decouple from parent environment
    os.chdir("/")
    os.setsid()
    os.umask(0)

    # Second fork
    try:
        pid = os.fork()
        if pid > 0:
            # Parent exits
            sys.exit(0)
    except OSError as e:
        sys.stderr.write(f"Fork #2 failed: {e}\n")
        sys.exit(1)

    # Redirect standard file descriptors
    sys.stdout.flush()
    sys.stderr.flush()
    si = open(os.devnull, "r")
    so = open(os.devnull, "a+")
    se = open(os.devnull, "a+")
    os.dup2(si.fileno(), sys.stdin.fileno())
    os.dup2(so.fileno(), sys.stdout.fileno())
    os.dup2(se.fileno(), sys.stderr.fileno())
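
    # Background note (added commentary, not from the original author): the
    # first fork lets the caller return while the child calls setsid() to
    # drop its controlling terminal; the second fork guarantees the daemon
    # is not a session leader and so can never reacquire a terminal.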


@sync.command()
@click.option(
    "--dry-run",
    is_flag=True,
@@ -335,13 +663,31 @@ async def _sync_outlook_data(
    help="Download email attachments",
    default=False,
)
@click.option(
    "--two-way-calendar",
    is_flag=True,
    help="Enable two-way calendar sync (sync local changes to server)",
    default=False,
)
@click.option(
    "--daemon",
    is_flag=True,
    help="Run in daemon mode.",
    default=False,
)
@click.option(
    "--dashboard",
    is_flag=True,
    help="Run with TUI dashboard.",
    default=False,
)
@click.option(
    "--notify",
    is_flag=True,
    help="Send macOS notifications for new email messages",
    default=False,
)
def run(
    dry_run,
    vdir,
    icsfile,
@@ -350,21 +696,33 @@ def sync(
    days_forward,
    continue_iteration,
    download_attachments,
    two_way_calendar,
    daemon,
    dashboard,
    notify,
):
    if dashboard:
        from .sync_dashboard import run_dashboard_sync

        asyncio.run(run_dashboard_sync())
    elif daemon:
        from .sync_daemon import create_daemon_config, SyncDaemon

        config = create_daemon_config(
            dry_run=dry_run,
            vdir=vdir,
            icsfile=icsfile,
            org=org,
            days_back=days_back,
            days_forward=days_forward,
            continue_iteration=continue_iteration,
            download_attachments=download_attachments,
            two_way_calendar=two_way_calendar,
            notify=notify,
        )

        daemon_instance = SyncDaemon(config)
        daemon_instance.start()
    else:
        asyncio.run(
            _sync_outlook_data(
@@ -376,10 +734,127 @@ def sync(
                days_forward,
                continue_iteration,
                download_attachments,
                two_way_calendar,
                notify,
            )
        )


@sync.command()
def stop():
    """Stop the sync daemon."""
    pid_file = os.path.expanduser("~/.config/luk/luk.pid")

    if not os.path.exists(pid_file):
        click.echo("Daemon is not running (no PID file found)")
        return

    try:
        with open(pid_file, "r") as f:
            pid = int(f.read().strip())

        # Send SIGTERM to the process
        os.kill(pid, signal.SIGTERM)

        # Remove PID file
        os.unlink(pid_file)

        click.echo(f"Daemon stopped (PID {pid})")
    except (ValueError, ProcessLookupError, OSError) as e:
        click.echo(f"Error stopping daemon: {e}")
        # Clean up stale PID file
        if os.path.exists(pid_file):
            os.unlink(pid_file)


@sync.command()
def status():
    """Check the status of the sync daemon."""
    pid_file = os.path.expanduser("~/.config/luk/luk.pid")

    if not os.path.exists(pid_file):
        click.echo("Daemon is not running")
        return

    try:
        with open(pid_file, "r") as f:
            pid = int(f.read().strip())

        # Check if the process exists
        os.kill(pid, 0)  # Send signal 0 to check if process exists
        click.echo(f"Daemon is running (PID {pid})")
    except (ValueError, ProcessLookupError, OSError):
        click.echo("Daemon is not running (stale PID file)")
        # Clean up stale PID file
        os.unlink(pid_file)


def check_calendar_changes(vdir_path, org):
    """
    Check if there are local calendar changes that need syncing.

    Args:
        vdir_path (str): Base vdir path
        org (str): Organization name

    Returns:
        tuple: (has_changes, change_description)
    """
    if not vdir_path:
        return False, "No vdir path configured"

    org_vdir_path = os.path.join(os.path.expanduser(vdir_path), org)

    if not os.path.exists(org_vdir_path):
        return False, "Calendar directory does not exist"

    try:
        # Get last sync time
        last_sync_time = get_last_sync_time(org_vdir_path)

        # Check if the vdir directory has been modified since the last sync
        vdir_mtime = os.path.getmtime(org_vdir_path)

        if vdir_mtime > last_sync_time:
            # Check for specific types of changes
            deleted_events = detect_deleted_events(org_vdir_path)

            # Count .ics files to detect new events
            import glob

            ics_files = glob.glob(os.path.join(org_vdir_path, "*.ics"))

            # Load previous state to compare
            state_file = os.path.join(org_vdir_path, ".sync_state.json")
            previous_state = {}
            if os.path.exists(state_file):
                try:
                    with open(state_file, "r") as f:
                        previous_state = json.load(f)
                except Exception:
                    pass

            new_event_count = len(ics_files) - len(previous_state) + len(deleted_events)

            if deleted_events or new_event_count > 0:
                changes = []
                if new_event_count > 0:
                    changes.append(f"{new_event_count} new events")
                if deleted_events:
                    changes.append(f"{len(deleted_events)} deleted events")

                return True, ", ".join(changes)
            else:
                return True, "directory modified"

        return False, "no changes detected"

    except Exception as e:
        return False, f"error checking calendar: {str(e)}"


async def daemon_mode(
    dry_run,
    vdir,
@@ -389,20 +864,47 @@ async def daemon_mode(
    days_forward,
    continue_iteration,
    download_attachments,
    two_way_calendar,
    notify,
):
    """
    Run the script in daemon mode, periodically syncing emails and calendar.
    """
    from src.services.microsoft_graph.mail import get_inbox_count_async
    from rich.console import Console
    from rich.panel import Panel
    from rich.text import Text
    from datetime import datetime
    import time

    console = Console()
    sync_interval = 300  # 5 minutes
    check_interval = 10  # 10 seconds
    last_sync_time = time.time() - sync_interval  # Force initial sync

    def create_status_display(status_text, status_color="cyan"):
        """Create a status panel for daemon mode."""
        timestamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
        content = Text()
        content.append(f"[{timestamp}] ", style="dim")
        content.append(status_text, style=status_color)

        return Panel(
            content,
            title="📧 Email & Calendar Sync Daemon",
            border_style="blue",
            padding=(0, 1),
        )

    # Initial display
    console.print(create_status_display("Starting daemon mode...", "green"))

    while True:
        if time.time() - last_sync_time >= sync_interval:
            # Show full sync status
            console.clear()
            console.print(create_status_display("Performing full sync...", "green"))

            # Perform a full sync
            await _sync_outlook_data(
                dry_run,
@@ -413,45 +915,150 @@ async def daemon_mode(
                days_forward,
                continue_iteration,
                download_attachments,
                two_way_calendar,
                notify,
            )
            last_sync_time = time.time()

            # Show completion
            console.print(create_status_display("Full sync completed ✅", "green"))
        else:
            # Perform a quick check: show checking status
            console.clear()
            console.print(create_status_display("Checking for changes...", "cyan"))

            try:
                # Authenticate and get access token for mail check
                scopes = ["https://graph.microsoft.com/Mail.Read"]
                access_token, headers = get_access_token(scopes)
                remote_message_count = await get_inbox_count_async(headers)
                maildir_path = os.path.expanduser(f"~/Mail/{org}")

                # Count local messages
                new_dir = os.path.join(maildir_path, "new")
                cur_dir = os.path.join(maildir_path, "cur")
                local_message_count = 0

                if os.path.exists(new_dir):
                    local_message_count += len(
                        [f for f in os.listdir(new_dir) if ".eml" in f]
                    )
                if os.path.exists(cur_dir):
                    local_message_count += len(
                        [f for f in os.listdir(cur_dir) if ".eml" in f]
                    )

                mail_changes = remote_message_count != local_message_count

                # Check for calendar changes if two-way sync is enabled
                calendar_changes = False
                calendar_change_desc = ""
                if two_way_calendar and vdir:
                    calendar_changes, calendar_change_desc = check_calendar_changes(
                        vdir, org
                    )

                # Check for outbound emails in the outbox
                base_maildir_path = os.getenv(
                    "MAILDIR_PATH", os.path.expanduser("~/Mail")
                )
                outbox_new_dir = os.path.join(base_maildir_path, org, "outbox", "new")
                outbox_changes = False
                pending_email_count = 0

                if os.path.exists(outbox_new_dir):
                    pending_emails = [
                        f for f in os.listdir(outbox_new_dir) if not f.startswith(".")
                    ]
                    pending_email_count = len(pending_emails)
                    outbox_changes = pending_email_count > 0

                # Check Godspeed operations
                godspeed_sync_due = should_run_godspeed_sync()
                sweep_due = should_run_sweep()

                # Determine what changed and show appropriate status
                changes_detected = (
                    mail_changes
                    or calendar_changes
                    or outbox_changes
                    or godspeed_sync_due
                    or sweep_due
                )

                if changes_detected:
                    change_parts = []
                    if mail_changes:
                        change_parts.append(
                            f"Mail: Remote {remote_message_count}, Local {local_message_count}"
                        )
                    if calendar_changes:
                        change_parts.append(f"Calendar: {calendar_change_desc}")
                    if outbox_changes:
                        change_parts.append(f"Outbox: {pending_email_count} pending")
                    if godspeed_sync_due:
                        change_parts.append("Godspeed sync due")
                    if sweep_due:
                        change_parts.append("Task sweep due")

                    console.print(
                        create_status_display(
                            f"Changes detected! {' | '.join(change_parts)}. Starting sync...",
                            "yellow",
                        )
                    )

                # Sync if any changes detected
                if changes_detected:
                    await _sync_outlook_data(
                        dry_run,
                        vdir,
                        icsfile,
                        org,
                        days_back,
                        days_forward,
                        continue_iteration,
                        download_attachments,
                        two_way_calendar,
                        notify,
                    )
                    last_sync_time = time.time()
                    console.print(create_status_display("Sync completed ✅", "green"))
                else:
                    status_parts = [
                        f"Mail: Remote {remote_message_count}, Local {local_message_count}"
                    ]
                    if two_way_calendar:
                        status_parts.append(f"Calendar: {calendar_change_desc}")

                    status_parts.append(f"Outbox: {pending_email_count} pending")

                    # Add Godspeed status
                    state = load_sync_state()
                    last_godspeed = state.get("last_godspeed_sync", 0)
                    minutes_since_godspeed = int((time.time() - last_godspeed) / 60)
                    status_parts.append(f"Godspeed: {minutes_since_godspeed}m ago")

                    last_sweep = state.get("last_sweep_date")
                    if last_sweep == datetime.now().strftime("%Y-%m-%d"):
                        status_parts.append("Sweep: done today")
                    else:
                        current_hour = datetime.now().hour
                        if current_hour >= 18:
                            status_parts.append("Sweep: due")
                        else:
                            hours_until_sweep = 18 - current_hour
                            status_parts.append(f"Sweep: in {hours_until_sweep}h")

                    console.print(
                        create_status_display(
                            f"No changes detected ({', '.join(status_parts)})",
                            "green",
                        )
                    )
            except Exception as e:
                console.print(
                    create_status_display(f"Error during check: {str(e)}", "red")
                )

        time.sleep(check_interval)
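The daemon surface added above can be exercised end to end like this (the `luk` entry-point name is an assumption inferred from the PID/log paths; adjust to the actual console script):

```bash
# Start the background daemon; it double-forks and logs to ~/.local/share/luk/luk.log
luk sync run --daemon --org corteva --vdir ~/Calendar --notify

# Inspect and stop it via the PID file at ~/.config/luk/luk.pid
luk sync status
luk sync stop
```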
329 src/cli/sync_daemon.py Normal file
@@ -0,0 +1,329 @@
"""Daemon mode with proper Unix logging."""

import os
import sys
import logging
import logging.handlers
import asyncio
import time
import signal
import json
from pathlib import Path
from datetime import datetime
from typing import Optional, Dict, Any

from src.cli.sync import _sync_outlook_data, should_run_godspeed_sync, should_run_sweep
from src.cli.sync import run_godspeed_sync, run_task_sweep, load_sync_state


class SyncDaemon:
    """Proper daemon with Unix logging."""

    def __init__(self, config: Dict[str, Any]):
        self.config = config
        self.running = False
        self.pid_file = Path(
            config.get("pid_file", "~/.config/luk/luk.pid")
        ).expanduser()
        self.log_file = Path(
            config.get("log_file", "~/.local/share/luk/luk.log")
        ).expanduser()
        self.sync_interval = config.get("sync_interval", 300)  # 5 minutes
        self.check_interval = config.get("check_interval", 10)  # 10 seconds
        self.logger = self._setup_logging()

    def _setup_logging(self) -> logging.Logger:
        """Set up proper Unix logging."""
        logger = logging.getLogger("sync_daemon")
        logger.setLevel(logging.INFO)

        # Ensure the log directory exists
        self.log_file.parent.mkdir(parents=True, exist_ok=True)

        # Rotating file handler (10MB max, keep 5 backups)
        handler = logging.handlers.RotatingFileHandler(
            self.log_file,
            maxBytes=10 * 1024 * 1024,  # 10MB
            backupCount=5,
            encoding="utf-8",
        )

        # Log format
        formatter = logging.Formatter(
            "%(asctime)s - %(name)s - %(levelname)s - %(message)s",
            datefmt="%Y-%m-%d %H:%M:%S",
        )
        handler.setFormatter(formatter)
        logger.addHandler(handler)

        return logger

    def daemonize(self) -> None:
        """Properly daemonize the process for Unix systems."""
        # First fork
        try:
            pid = os.fork()
            if pid > 0:
                # Parent exits
                sys.exit(0)
        except OSError as e:
            sys.stderr.write(f"Fork #1 failed: {e}\n")
            sys.exit(1)

        # Decouple from parent environment
        os.chdir("/")
        os.setsid()
        os.umask(0)

        # Second fork
        try:
            pid = os.fork()
            if pid > 0:
                # Parent exits
                sys.exit(0)
        except OSError as e:
            sys.stderr.write(f"Fork #2 failed: {e}\n")
            sys.exit(1)

        # Redirect standard file descriptors to /dev/null
        sys.stdout.flush()
        sys.stderr.flush()
        si = open(os.devnull, "r")
        so = open(os.devnull, "a+")
        se = open(os.devnull, "a+")
        os.dup2(si.fileno(), sys.stdin.fileno())
        os.dup2(so.fileno(), sys.stdout.fileno())
        os.dup2(se.fileno(), sys.stderr.fileno())

        # Write PID file
        self.pid_file.parent.mkdir(parents=True, exist_ok=True)
        with open(self.pid_file, "w") as f:
            f.write(str(os.getpid()))

    def start(self) -> None:
        """Start the daemon."""
        # Check if already running
        if self.is_running():
            print(f"Daemon is already running (PID {self.get_pid()})")
            return

        print("Starting sync daemon...")
        self.daemonize()

        # Set up signal handlers
        signal.signal(signal.SIGTERM, self._signal_handler)
        signal.signal(signal.SIGINT, self._signal_handler)

        self.logger.info("Sync daemon started")
        self.running = True

        # Run the daemon loop
        asyncio.run(self._daemon_loop())

    def stop(self) -> None:
        """Stop the daemon."""
        if not self.is_running():
            print("Daemon is not running")
            return

        try:
            pid = self.get_pid()
            os.kill(pid, signal.SIGTERM)

            # Wait for the process to exit
            for _ in range(10):
                try:
                    os.kill(pid, 0)  # Check if process exists
                    time.sleep(0.5)
                except ProcessLookupError:
                    break
            else:
                # Force kill if still running
                os.kill(pid, signal.SIGKILL)

            # Remove PID file
            if self.pid_file.exists():
                self.pid_file.unlink()

            print(f"Daemon stopped (PID {pid})")
            self.logger.info("Sync daemon stopped")

        except Exception as e:
            print(f"Error stopping daemon: {e}")

    def status(self) -> None:
        """Check daemon status."""
        if not self.is_running():
            print("Daemon is not running")
            return

        pid = self.get_pid()
        print(f"Daemon is running (PID {pid})")

        # Show recent log entries
        try:
            with open(self.log_file, "r") as f:
                lines = f.readlines()
                if lines:
                    print("\nRecent log entries:")
                    for line in lines[-5:]:
                        print(f"  {line.strip()}")
        except Exception:
            pass

    def is_running(self) -> bool:
        """Check if the daemon is running."""
        if not self.pid_file.exists():
            return False

        try:
            pid = self.get_pid()
            os.kill(pid, 0)  # Check if process exists
            return True
        except (ValueError, ProcessLookupError, OSError):
            # Stale PID file, remove it
            if self.pid_file.exists():
                self.pid_file.unlink()
            return False

    def get_pid(self) -> int:
        """Get the PID from the PID file."""
        with open(self.pid_file, "r") as f:
            return int(f.read().strip())

    def _signal_handler(self, signum, frame):
        """Handle shutdown signals."""
        self.logger.info(f"Received signal {signum}, shutting down...")
        self.running = False

        # Remove PID file
        if self.pid_file.exists():
            self.pid_file.unlink()

        sys.exit(0)

    async def _daemon_loop(self) -> None:
        """Main daemon loop."""
        last_sync_time = time.time() - self.sync_interval  # Force initial sync

        while self.running:
            try:
                current_time = time.time()

                if current_time - last_sync_time >= self.sync_interval:
                    self.logger.info("Performing scheduled sync...")
                    await self._perform_sync()
                    last_sync_time = current_time
                    self.logger.info("Scheduled sync completed")
                else:
                    # Check for changes
                    changes_detected = await self._check_for_changes()
                    if changes_detected:
                        self.logger.info("Changes detected, triggering sync...")
                        await self._perform_sync()
                        last_sync_time = current_time
                    else:
                        self.logger.debug("No changes detected")

                await asyncio.sleep(self.check_interval)

            except Exception as e:
                self.logger.error(f"Error in daemon loop: {e}")
                await asyncio.sleep(30)  # Wait before retrying

    async def _perform_sync(self) -> None:
        """Perform a full sync."""
        try:
            await _sync_outlook_data(
                dry_run=self.config.get("dry_run", False),
                vdir=self.config.get("vdir", "~/Calendar"),
                icsfile=self.config.get("icsfile"),
                org=self.config.get("org", "corteva"),
                days_back=self.config.get("days_back", 1),
                days_forward=self.config.get("days_forward", 30),
                continue_iteration=self.config.get("continue_iteration", False),
                download_attachments=self.config.get("download_attachments", False),
                two_way_calendar=self.config.get("two_way_calendar", False),
                notify=self.config.get("notify", False),
            )
            self.logger.info("Sync completed successfully")
        except Exception as e:
            self.logger.error(f"Sync failed: {e}")

    async def _check_for_changes(self) -> bool:
        """Check if there are changes that require syncing."""
        try:
            # Check Godspeed operations
            godspeed_sync_due = should_run_godspeed_sync()
            sweep_due = should_run_sweep()

            if godspeed_sync_due or sweep_due:
                self.logger.info("Godspeed operations due")
                return True

            # Add other change detection logic here
            # For now, just return False
            return False

        except Exception as e:
            self.logger.error(f"Error checking for changes: {e}")
            return False


def create_daemon_config(**kwargs) -> Dict[str, Any]:
    """Create the daemon configuration from command line args."""
    return {
        "dry_run": kwargs.get("dry_run", False),
        "vdir": kwargs.get("vdir", "~/Calendar"),
        "icsfile": kwargs.get("icsfile"),
        "org": kwargs.get("org", "corteva"),
        "days_back": kwargs.get("days_back", 1),
        "days_forward": kwargs.get("days_forward", 30),
        "continue_iteration": kwargs.get("continue_iteration", False),
        "download_attachments": kwargs.get("download_attachments", False),
        "two_way_calendar": kwargs.get("two_way_calendar", False),
        "notify": kwargs.get("notify", False),
        "pid_file": kwargs.get("pid_file", "~/.config/luk/luk.pid"),
        "log_file": kwargs.get("log_file", "~/.local/share/luk/luk.log"),
        "sync_interval": kwargs.get("sync_interval", 300),
        "check_interval": kwargs.get("check_interval", 10),
    }


def main():
    """Main daemon entry point."""
    import argparse

    parser = argparse.ArgumentParser(description="Sync daemon management")
    parser.add_argument(
        "action", choices=["start", "stop", "status", "logs"], help="Action to perform"
    )
    parser.add_argument("--dry-run", action="store_true", help="Dry run mode")
    parser.add_argument("--org", default="corteva", help="Organization name")
    parser.add_argument("--vdir", default="~/Calendar", help="Calendar directory")
    parser.add_argument("--notify", action="store_true", help="Enable notifications")

    args = parser.parse_args()

    config = create_daemon_config(
        dry_run=args.dry_run, org=args.org, vdir=args.vdir, notify=args.notify
    )

    daemon = SyncDaemon(config)

    if args.action == "start":
        daemon.start()
    elif args.action == "stop":
        daemon.stop()
    elif args.action == "status":
        daemon.status()
    elif args.action == "logs":
        try:
            with open(daemon.log_file, "r") as f:
                print(f.read())
        except Exception as e:
            print(f"Error reading logs: {e}")


if __name__ == "__main__":
    main()
680 src/cli/sync_dashboard.py Normal file
@@ -0,0 +1,680 @@
"""TUI dashboard for sync progress with scrollable logs."""

from textual.app import App, ComposeResult
from textual.containers import Container, Horizontal, Vertical
from textual.widgets import (
    Header,
    Footer,
    Static,
    ProgressBar,
    Log,
    ListView,
    ListItem,
    Label,
)
from textual.reactive import reactive
from textual.binding import Binding
from rich.text import Text
from datetime import datetime, timedelta
import asyncio
from typing import Dict, Any, Optional, List, Callable

# Default sync interval in seconds (5 minutes)
DEFAULT_SYNC_INTERVAL = 300

# Futuristic spinner frames
# SPINNER_FRAMES = ["⠋", "⠙", "⠹", "⠸", "⠼", "⠴", "⠦", "⠧", "⠇", "⠏"]
# Alternative spinners you could use:
# SPINNER_FRAMES = ["◢", "◣", "◤", "◥"]  # Rotating triangle
SPINNER_FRAMES = ["▰▱▱▱▱", "▰▰▱▱▱", "▰▰▰▱▱", "▰▰▰▰▱", "▰▰▰▰▰", "▱▰▰▰▰", "▱▱▰▰▰", "▱▱▱▰▰", "▱▱▱▱▰"]  # Loading bar
# SPINNER_FRAMES = ["⣾", "⣽", "⣻", "⢿", "⡿", "⣟", "⣯", "⣷"]  # Braille dots
# SPINNER_FRAMES = ["◐", "◓", "◑", "◒"]  # Circle quarters
# SPINNER_FRAMES = ["⠁", "⠂", "⠄", "⡀", "⢀", "⠠", "⠐", "⠈"]  # Braille orbit


class TaskStatus:
    """Status constants for tasks."""

    PENDING = "pending"
    RUNNING = "running"
    COMPLETED = "completed"
    ERROR = "error"


class TaskListItem(ListItem):
    """A list item representing a sync task."""

    def __init__(self, task_id: str, task_name: str, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.task_id = task_id
        self.task_name = task_name
        self.status = TaskStatus.PENDING
        self.progress = 0
        self.total = 100
        self.spinner_frame = 0

    def compose(self) -> ComposeResult:
        """Compose the task item layout."""
        yield Static(self._build_content_text(), id=f"task-content-{self.task_id}")

    def _get_status_icon(self) -> str:
        """Get the icon for the current status."""
        if self.status == TaskStatus.RUNNING:
            return SPINNER_FRAMES[self.spinner_frame % len(SPINNER_FRAMES)]
        icons = {
            TaskStatus.PENDING: "○",
            TaskStatus.COMPLETED: "✓",
            TaskStatus.ERROR: "✗",
        }
        return icons.get(self.status, "○")

    def advance_spinner(self) -> None:
        """Advance the spinner to the next frame."""
        self.spinner_frame = (self.spinner_frame + 1) % len(SPINNER_FRAMES)

    def _get_status_color(self) -> str:
        """Get the color for the current status."""
        colors = {
            TaskStatus.PENDING: "dim",
            TaskStatus.RUNNING: "cyan",
            TaskStatus.COMPLETED: "bright_white",
            TaskStatus.ERROR: "red",
        }
        return colors.get(self.status, "white")

    def _build_content_text(self) -> Text:
        """Build the task content text."""
        icon = self._get_status_icon()
        color = self._get_status_color()

        # Use a green checkmark for completed, but white text for readability
        if self.status == TaskStatus.RUNNING:
            progress_pct = (
                int((self.progress / self.total) * 100) if self.total > 0 else 0
            )
            text = Text()
            text.append(f"{icon} ", style="cyan")
            text.append(f"{self.task_name} [{progress_pct}%]", style=color)
            return text
        elif self.status == TaskStatus.COMPLETED:
            text = Text()
            text.append(f"{icon} ", style="green")  # Green checkmark
            text.append(f"{self.task_name} [Done]", style=color)
            return text
        elif self.status == TaskStatus.ERROR:
            text = Text()
            text.append(f"{icon} ", style="red")
            text.append(f"{self.task_name} [Error]", style=color)
            return text
        else:
            return Text(f"{icon} {self.task_name}", style=color)

    def update_display(self) -> None:
        """Update the display of this item."""
        try:
            content = self.query_one(f"#task-content-{self.task_id}", Static)
            content.update(self._build_content_text())
        except Exception:
            pass


class SyncDashboard(App):
    """TUI dashboard for sync operations."""

    BINDINGS = [
        Binding("q", "quit", "Quit"),
        Binding("ctrl+c", "quit", "Quit"),
        Binding("s", "sync_now", "Sync Now"),
        Binding("r", "refresh", "Refresh"),
        Binding("+", "increase_interval", "+Interval"),
        Binding("-", "decrease_interval", "-Interval"),
        Binding("up", "cursor_up", "Up", show=False),
        Binding("down", "cursor_down", "Down", show=False),
    ]

    CSS = """
    .dashboard {
        height: 100%;
        layout: horizontal;
    }

    .sidebar {
        width: 30;
        height: 100%;
        border: solid $primary;
        padding: 0;
    }

    .sidebar-title {
        text-style: bold;
        padding: 1;
        background: $primary-darken-2;
    }

    .countdown-container {
        height: 3;
        padding: 0 1;
        border-top: solid $primary;
        background: $surface;
    }

    .countdown-text {
        text-align: center;
    }

    .main-panel {
        width: 1fr;
        height: 100%;
        padding: 0;
    }

    .task-header {
        height: 5;
        padding: 1;
        border-bottom: solid $primary;
    }

    .task-name {
        text-style: bold;
    }

    .progress-row {
        height: 3;
        padding: 0 1;
    }

    .log-container {
        height: 1fr;
        border: solid $primary;
        padding: 0;
    }

    .log-title {
        padding: 0 1;
        background: $primary-darken-2;
    }

    ListView {
        height: 1fr;
    }

    ListItem {
        padding: 0 1;
    }

    ListItem:hover {
        background: $primary-darken-1;
    }

    Log {
        height: 1fr;
        border: none;
    }

    ProgressBar {
        width: 1fr;
        padding: 0 1;
    }
    """

    selected_task: reactive[str] = reactive("archive")
    sync_interval: reactive[int] = reactive(DEFAULT_SYNC_INTERVAL)
    next_sync_time: reactive[float] = reactive(0.0)

    def __init__(self, sync_interval: int = DEFAULT_SYNC_INTERVAL):
        super().__init__()
        self._mounted: asyncio.Event = asyncio.Event()
        self._task_logs: Dict[str, List[str]] = {}
        self._task_items: Dict[str, TaskListItem] = {}
        self._sync_callback: Optional[Callable] = None
        self._countdown_task: Optional[asyncio.Task] = None
        self._spinner_task: Optional[asyncio.Task] = None
        self._initial_sync_interval = sync_interval

    def compose(self) -> ComposeResult:
        """Compose the dashboard layout."""
        yield Header()

        with Horizontal(classes="dashboard"):
            # Sidebar with task list
            with Vertical(classes="sidebar"):
                yield Static("Tasks", classes="sidebar-title")
                yield ListView(
                    # Stage 1: Sync local changes to server
                    TaskListItem("archive", "Archive Mail", id="task-archive"),
                    TaskListItem("outbox", "Outbox Send", id="task-outbox"),
                    # Stage 2: Fetch from server
                    TaskListItem("inbox", "Inbox Sync", id="task-inbox"),
                    TaskListItem("calendar", "Calendar Sync", id="task-calendar"),
                    # Stage 3: Task management
                    TaskListItem("godspeed", "Godspeed Sync", id="task-godspeed"),
                    TaskListItem("sweep", "Task Sweep", id="task-sweep"),
                    id="task-list",
                )
                # Countdown timer at the bottom of the sidebar
                with Vertical(classes="countdown-container"):
                    yield Static(
                        "Next sync: --:--", id="countdown", classes="countdown-text"
                    )

            # Main panel with selected task details
            with Vertical(classes="main-panel"):
                # Task header with name and progress
                with Vertical(classes="task-header"):
                    yield Static(
                        "Archive Mail", id="selected-task-name", classes="task-name"
                    )
                    with Horizontal(classes="progress-row"):
                        yield Static("Progress:", id="progress-label")
                        yield ProgressBar(total=100, id="task-progress")
                        yield Static("0%", id="progress-percent")

                # Log for the selected task
                with Vertical(classes="log-container"):
                    yield Static("Activity Log", classes="log-title")
                    yield Log(id="task-log")

        yield Footer()

    def on_mount(self) -> None:
        """Initialize the dashboard."""
        # Store references to task items
        task_list = self.query_one("#task-list", ListView)
        for item in task_list.children:
            if isinstance(item, TaskListItem):
                self._task_items[item.task_id] = item
                self._task_logs[item.task_id] = []

        # Initialize the sync interval
        self.sync_interval = self._initial_sync_interval
        self.schedule_next_sync()

        # Start the countdown timer and spinner animation
        self._countdown_task = asyncio.create_task(self._update_countdown())
        self._spinner_task = asyncio.create_task(self._animate_spinners())

        self._log_to_task("archive", "Dashboard initialized. Waiting to start sync...")
        self._mounted.set()

    def on_list_view_selected(self, event: ListView.Selected) -> None:
        """Handle task selection from the list."""
        if isinstance(event.item, TaskListItem):
            self.selected_task = event.item.task_id
            self._update_main_panel()

    def on_list_view_highlighted(self, event: ListView.Highlighted) -> None:
        """Handle task highlight from the list."""
        if isinstance(event.item, TaskListItem):
            self.selected_task = event.item.task_id
            self._update_main_panel()
|
||||
|
||||
def _update_main_panel(self) -> None:
|
||||
"""Update the main panel to show selected task details."""
|
||||
task_item = self._task_items.get(self.selected_task)
|
||||
if not task_item:
|
||||
return
|
||||
|
||||
# Update task name
|
||||
try:
|
||||
name_widget = self.query_one("#selected-task-name", Static)
|
||||
name_widget.update(Text(task_item.task_name, style="bold"))
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
# Update progress bar
|
||||
try:
|
||||
progress_bar = self.query_one("#task-progress", ProgressBar)
|
||||
progress_bar.total = task_item.total
|
||||
progress_bar.progress = task_item.progress
|
||||
|
||||
percent_widget = self.query_one("#progress-percent", Static)
|
||||
pct = (
|
||||
int((task_item.progress / task_item.total) * 100)
|
||||
if task_item.total > 0
|
||||
else 0
|
||||
)
|
||||
percent_widget.update(f"{pct}%")
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
# Update log with task-specific logs
|
||||
try:
|
||||
log_widget = self.query_one("#task-log", Log)
|
||||
log_widget.clear()
|
||||
for entry in self._task_logs.get(self.selected_task, []):
|
||||
log_widget.write_line(entry)
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
def _log_to_task(self, task_id: str, message: str, level: str = "INFO") -> None:
|
||||
"""Add a log entry to a specific task."""
|
||||
timestamp = datetime.now().strftime("%H:%M:%S")
|
||||
formatted = f"[{timestamp}] {level}: {message}"
|
||||
|
||||
if task_id not in self._task_logs:
|
||||
self._task_logs[task_id] = []
|
||||
self._task_logs[task_id].append(formatted)
|
||||
|
||||
# If this is the selected task, also write to the visible log
|
||||
if task_id == self.selected_task:
|
||||
try:
|
||||
log_widget = self.query_one("#task-log", Log)
|
||||
log_widget.write_line(formatted)
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
def start_task(self, task_id: str, total: int = 100) -> None:
|
||||
"""Start a task."""
|
||||
if task_id in self._task_items:
|
||||
item = self._task_items[task_id]
|
||||
item.status = TaskStatus.RUNNING
|
||||
item.progress = 0
|
||||
item.total = total
|
||||
item.update_display()
|
||||
self._log_to_task(task_id, f"Starting {item.task_name}...")
|
||||
if task_id == self.selected_task:
|
||||
self._update_main_panel()
|
||||
|
||||
def update_task(self, task_id: str, progress: int, message: str = "") -> None:
|
||||
"""Update task progress."""
|
||||
if task_id in self._task_items:
|
||||
item = self._task_items[task_id]
|
||||
item.progress = progress
|
||||
item.update_display()
|
||||
if message:
|
||||
self._log_to_task(task_id, message)
|
||||
if task_id == self.selected_task:
|
||||
self._update_main_panel()
|
||||
|
||||
def complete_task(self, task_id: str, message: str = "") -> None:
|
||||
"""Mark a task as complete."""
|
||||
if task_id in self._task_items:
|
||||
item = self._task_items[task_id]
|
||||
item.status = TaskStatus.COMPLETED
|
||||
item.progress = item.total
|
||||
item.update_display()
|
||||
self._log_to_task(
|
||||
task_id,
|
||||
f"Completed: {message}" if message else "Completed successfully",
|
||||
)
|
||||
if task_id == self.selected_task:
|
||||
self._update_main_panel()
|
||||
|
||||
def error_task(self, task_id: str, error: str) -> None:
|
||||
"""Mark a task as errored."""
|
||||
if task_id in self._task_items:
|
||||
item = self._task_items[task_id]
|
||||
item.status = TaskStatus.ERROR
|
||||
item.update_display()
|
||||
self._log_to_task(task_id, f"ERROR: {error}", "ERROR")
|
||||
if task_id == self.selected_task:
|
||||
self._update_main_panel()
|
||||
|
||||
def skip_task(self, task_id: str, reason: str = "") -> None:
|
||||
"""Mark a task as skipped (completed with no work)."""
|
||||
if task_id in self._task_items:
|
||||
item = self._task_items[task_id]
|
||||
item.status = TaskStatus.COMPLETED
|
||||
item.update_display()
|
||||
self._log_to_task(task_id, f"Skipped: {reason}" if reason else "Skipped")
|
||||
if task_id == self.selected_task:
|
||||
self._update_main_panel()
|
||||
|
||||
def action_refresh(self) -> None:
|
||||
"""Refresh the dashboard."""
|
||||
self._update_main_panel()
|
||||
|
||||
def action_cursor_up(self) -> None:
|
||||
"""Move cursor up in task list."""
|
||||
task_list = self.query_one("#task-list", ListView)
|
||||
task_list.action_cursor_up()
|
||||
|
||||
def action_cursor_down(self) -> None:
|
||||
"""Move cursor down in task list."""
|
||||
task_list = self.query_one("#task-list", ListView)
|
||||
task_list.action_cursor_down()
|
||||
|
||||
def action_sync_now(self) -> None:
|
||||
"""Trigger an immediate sync."""
|
||||
if self._sync_callback:
|
||||
asyncio.create_task(self._run_sync_callback())
|
||||
else:
|
||||
self._log_to_task("archive", "No sync callback configured")
|
||||
|
||||
async def _run_sync_callback(self) -> None:
|
||||
"""Run the sync callback if set."""
|
||||
if self._sync_callback:
|
||||
if asyncio.iscoroutinefunction(self._sync_callback):
|
||||
await self._sync_callback()
|
||||
else:
|
||||
self._sync_callback()
|
||||
|
||||
def action_increase_interval(self) -> None:
|
||||
"""Increase sync interval by 1 minute."""
|
||||
self.sync_interval = min(self.sync_interval + 60, 3600) # Max 1 hour
|
||||
self._update_countdown_display()
|
||||
self._log_to_task(
|
||||
self.selected_task,
|
||||
f"Sync interval: {self.sync_interval // 60} min",
|
||||
)
|
||||
|
||||
def action_decrease_interval(self) -> None:
|
||||
"""Decrease sync interval by 1 minute."""
|
||||
self.sync_interval = max(self.sync_interval - 60, 60) # Min 1 minute
|
||||
self._update_countdown_display()
|
||||
self._log_to_task(
|
||||
self.selected_task,
|
||||
f"Sync interval: {self.sync_interval // 60} min",
|
||||
)
|
||||
|
||||
def set_sync_callback(self, callback: Callable) -> None:
|
||||
"""Set the callback to run when sync is triggered."""
|
||||
self._sync_callback = callback
|
||||
|
||||
def schedule_next_sync(self) -> None:
|
||||
"""Schedule the next sync time."""
|
||||
import time
|
||||
|
||||
self.next_sync_time = time.time() + self.sync_interval
|
||||
|
||||
def reset_all_tasks(self) -> None:
|
||||
"""Reset all tasks to pending state."""
|
||||
for task_id, item in self._task_items.items():
|
||||
item.status = TaskStatus.PENDING
|
||||
item.progress = 0
|
||||
item.update_display()
|
||||
self._update_main_panel()
|
||||
|
||||
async def _update_countdown(self) -> None:
|
||||
"""Update the countdown timer every second."""
|
||||
import time
|
||||
|
||||
while True:
|
||||
try:
|
||||
self._update_countdown_display()
|
||||
await asyncio.sleep(1)
|
||||
except asyncio.CancelledError:
|
||||
break
|
||||
except Exception:
|
||||
await asyncio.sleep(1)
|
||||
|
||||
def _update_countdown_display(self) -> None:
|
||||
"""Update the countdown display widget."""
|
||||
import time
|
||||
|
||||
try:
|
||||
countdown_widget = self.query_one("#countdown", Static)
|
||||
remaining = max(0, self.next_sync_time - time.time())
|
||||
|
||||
if remaining <= 0:
|
||||
countdown_widget.update(f"Syncing... ({self.sync_interval // 60}m)")
|
||||
else:
|
||||
minutes = int(remaining // 60)
|
||||
seconds = int(remaining % 60)
|
||||
countdown_widget.update(
|
||||
f"Next: {minutes:02d}:{seconds:02d} ({self.sync_interval // 60}m)"
|
||||
)
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
async def _animate_spinners(self) -> None:
|
||||
"""Animate spinners for running tasks."""
|
||||
while True:
|
||||
try:
|
||||
# Update all running task spinners
|
||||
for task_id, item in self._task_items.items():
|
||||
if item.status == TaskStatus.RUNNING:
|
||||
item.advance_spinner()
|
||||
item.update_display()
|
||||
await asyncio.sleep(0.08) # ~12 FPS for smooth animation
|
||||
except asyncio.CancelledError:
|
||||
break
|
||||
except Exception:
|
||||
await asyncio.sleep(0.08)
|
||||
|
||||
|
||||
class SyncProgressTracker:
|
||||
"""Track sync progress and update the dashboard."""
|
||||
|
||||
def __init__(self, dashboard: SyncDashboard):
|
||||
self.dashboard = dashboard
|
||||
|
||||
def start_task(self, task_id: str, total: int = 100) -> None:
|
||||
"""Start tracking a task."""
|
||||
self.dashboard.start_task(task_id, total)
|
||||
|
||||
def update_task(self, task_id: str, progress: int, message: str = "") -> None:
|
||||
"""Update task progress."""
|
||||
self.dashboard.update_task(task_id, progress, message)
|
||||
|
||||
def complete_task(self, task_id: str, message: str = "") -> None:
|
||||
"""Mark a task as complete."""
|
||||
self.dashboard.complete_task(task_id, message)
|
||||
|
||||
def error_task(self, task_id: str, error: str) -> None:
|
||||
"""Mark a task as failed."""
|
||||
self.dashboard.error_task(task_id, error)
|
||||
|
||||
def skip_task(self, task_id: str, reason: str = "") -> None:
|
||||
"""Mark a task as skipped."""
|
||||
self.dashboard.skip_task(task_id, reason)
|
||||
|
||||
|
||||
# Global dashboard instance
|
||||
_dashboard_instance: Optional[SyncDashboard] = None
|
||||
_progress_tracker: Optional[SyncProgressTracker] = None
|
||||
|
||||
|
||||
def get_dashboard() -> Optional[SyncDashboard]:
|
||||
"""Get the global dashboard instance."""
|
||||
global _dashboard_instance
|
||||
return _dashboard_instance
|
||||
|
||||
|
||||
def get_progress_tracker() -> Optional[SyncProgressTracker]:
|
||||
"""Get the global progress_tracker"""
|
||||
global _progress_tracker
|
||||
return _progress_tracker
|
||||
|
||||
|
||||
async def run_dashboard_sync():
|
||||
"""Run sync with dashboard UI."""
|
||||
global _dashboard_instance, _progress_tracker
|
||||
|
||||
dashboard = SyncDashboard()
|
||||
tracker = SyncProgressTracker(dashboard)
|
||||
|
||||
_dashboard_instance = dashboard
|
||||
_progress_tracker = tracker
|
||||
|
||||
async def do_sync():
|
||||
"""Run the actual sync process."""
|
||||
try:
|
||||
# Reset all tasks before starting
|
||||
dashboard.reset_all_tasks()
|
||||
|
||||
# Simulate sync progress for demo (replace with actual sync calls)
|
||||
|
||||
# Stage 1: Sync local changes to server
|
||||
|
||||
# Archive mail
|
||||
tracker.start_task("archive", 100)
|
||||
tracker.update_task("archive", 50, "Scanning for archived messages...")
|
||||
await asyncio.sleep(0.3)
|
||||
tracker.update_task("archive", 100, "Moving 3 messages to archive...")
|
||||
await asyncio.sleep(0.2)
|
||||
tracker.complete_task("archive", "3 messages archived")
|
||||
|
||||
# Outbox
|
||||
tracker.start_task("outbox", 100)
|
||||
tracker.update_task("outbox", 50, "Checking outbox...")
|
||||
await asyncio.sleep(0.2)
|
||||
tracker.complete_task("outbox", "No pending emails")
|
||||
|
||||
# Stage 2: Fetch from server
|
||||
|
||||
# Inbox sync
|
||||
tracker.start_task("inbox", 100)
|
||||
for i in range(0, 101, 20):
|
||||
tracker.update_task("inbox", i, f"Fetching emails... {i}%")
|
||||
await asyncio.sleep(0.3)
|
||||
tracker.complete_task("inbox", "150 emails processed")
|
||||
|
||||
# Calendar sync
|
||||
tracker.start_task("calendar", 100)
|
||||
for i in range(0, 101, 25):
|
||||
tracker.update_task("calendar", i, f"Syncing events... {i}%")
|
||||
await asyncio.sleep(0.3)
|
||||
tracker.complete_task("calendar", "25 events synced")
|
||||
|
||||
# Stage 3: Task management
|
||||
|
||||
# Godspeed sync
|
||||
tracker.start_task("godspeed", 100)
|
||||
for i in range(0, 101, 33):
|
||||
tracker.update_task(
|
||||
"godspeed", min(i, 100), f"Syncing tasks... {min(i, 100)}%"
|
||||
)
|
||||
await asyncio.sleep(0.3)
|
||||
tracker.complete_task("godspeed", "42 tasks synced")
|
||||
|
||||
# Task sweep
|
||||
tracker.start_task("sweep")
|
||||
tracker.update_task("sweep", 50, "Scanning notes directory...")
|
||||
await asyncio.sleep(0.2)
|
||||
tracker.skip_task("sweep", "Before 6 PM, skipping daily sweep")
|
||||
|
||||
# Schedule next sync
|
||||
dashboard.schedule_next_sync()
|
||||
|
||||
except Exception as e:
|
||||
tracker.error_task("archive", str(e))
|
||||
|
||||
# Set the sync callback so 's' key triggers it
|
||||
dashboard.set_sync_callback(do_sync)
|
||||
|
||||
async def sync_loop():
|
||||
"""Run sync on interval."""
|
||||
import time
|
||||
|
||||
# Wait for the dashboard to be mounted before updating widgets
|
||||
await dashboard._mounted.wait()
|
||||
|
||||
# Run initial sync
|
||||
await do_sync()
|
||||
|
||||
# Then loop waiting for next sync time
|
||||
while True:
|
||||
try:
|
||||
remaining = dashboard.next_sync_time - time.time()
|
||||
if remaining <= 0:
|
||||
await do_sync()
|
||||
else:
|
||||
await asyncio.sleep(1)
|
||||
except asyncio.CancelledError:
|
||||
break
|
||||
except Exception:
|
||||
await asyncio.sleep(1)
|
||||
|
||||
# Run dashboard and sync loop concurrently
|
||||
await asyncio.gather(dashboard.run_async(), sync_loop())
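
The `do_sync` body above is simulation code, as its comment notes. A real pipeline step would report through the same tracker API; here is a minimal sketch, in which `sync_inbox()` is a hypothetical real sync coroutine and only the tracker methods shown above are assumed:

```python
async def run_inbox_step() -> None:
    tracker = get_progress_tracker()
    if tracker is None:
        return
    tracker.start_task("inbox", 100)
    try:
        processed = await sync_inbox()  # hypothetical real sync coroutine
        tracker.complete_task("inbox", f"{processed} emails processed")
    except Exception as e:
        tracker.error_task("inbox", str(e))
```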
607
src/cli/ticktick.py
Normal file
@@ -0,0 +1,607 @@
"""
TickTick CLI commands with aliases for task management.
"""

import click
from datetime import datetime
from typing import Optional, List
from rich.console import Console

from src.services.ticktick import TickTickService
from src.utils.ticktick_utils import (
    create_task_table,
    print_task_details,
    open_task,
    parse_priority,
    validate_date,
    console,
)


# Initialize service lazily
def get_ticktick_service():
    """Get the TickTick service, initializing it if needed."""
    global _ticktick_service
    if "_ticktick_service" not in globals():
        _ticktick_service = TickTickService()
    return _ticktick_service


@click.group()
def ticktick():
    """TickTick task management CLI."""
    pass


@ticktick.command(name="list")
@click.option("--project", "-p", help="Filter by project name")
@click.option(
    "--due-date", "-d", help="Filter by due date (today, tomorrow, YYYY-MM-DD)"
)
@click.option("--all", "-a", is_flag=True, help="Show all tasks including completed")
@click.option("--priority", "-pr", help="Filter by priority (0-5, low, medium, high)")
@click.option("--tag", "-t", help="Filter by tag name")
@click.option("--limit", "-l", default=20, help="Limit number of results")
def list_tasks(
    project: Optional[str],
    due_date: Optional[str],
    all: bool,
    priority: Optional[str],
    tag: Optional[str],
    limit: int,
):
    """List tasks (alias: ls)."""
    try:
        ticktick_service = get_ticktick_service()

        if due_date:
            if not validate_date(due_date):
                console.print(f"[red]Invalid date format: {due_date}[/red]")
                return
            tasks = ticktick_service.get_tasks_by_due_date(due_date)
        elif project:
            tasks = ticktick_service.get_tasks_by_project(project)
        else:
            tasks = ticktick_service.get_tasks(completed=all)

        # Apply additional filters
        if priority:
            priority_val = parse_priority(priority)
            tasks = [t for t in tasks if t.get("priority", 0) == priority_val]

        if tag:
            tasks = [
                t
                for t in tasks
                if tag.lower() in [tg.lower() for tg in t.get("tags", [])]
            ]

        # Limit results
        if limit > 0:
            tasks = tasks[:limit]

        if not tasks:
            console.print("[yellow]No tasks found matching criteria[/yellow]")
            return

        # Display results
        table = create_task_table(tasks, show_project=not project)
        console.print(table)
        console.print(f"\n[dim]Showing {len(tasks)} tasks[/dim]")

    except Exception as e:
        console.print(f"[red]Error listing tasks: {str(e)}[/red]")


@ticktick.command(name="add")
@click.argument("title")
@click.option("--project", "-p", help="Project name")
@click.option("--due-date", "-d", help="Due date (today, tomorrow, YYYY-MM-DD)")
@click.option("--priority", "-pr", help="Priority (0-5, low, medium, high)")
@click.option("--content", "-c", help="Task description/content")
@click.option("--tags", "-t", help="Comma-separated list of tags")
def add_task(
    title: str,
    project: Optional[str],
    due_date: Optional[str],
    priority: Optional[str],
    content: Optional[str],
    tags: Optional[str],
):
    """Add a new task (alias: a)."""
    try:
        # Validate due date if provided
        if due_date and not validate_date(due_date):
            console.print(f"[red]Invalid date format: {due_date}[/red]")
            return

        # Parse priority
        priority_val = parse_priority(priority) if priority else None

        # Parse tags
        tag_list = [tag.strip() for tag in tags.split(",")] if tags else None

        # Create task
        task = get_ticktick_service().create_task(
            title=title,
            project_name=project,
            due_date=due_date,
            priority=priority_val,
            content=content,
            tags=tag_list,
        )

        if task:
            console.print(f"[green]✓ Created task: {title}[/green]")
            console.print(f"[dim]Task ID: {task.get('id', 'N/A')}[/dim]")
        else:
            console.print("[red]Failed to create task[/red]")

    except Exception as e:
        console.print(f"[red]Error creating task: {str(e)}[/red]")


@ticktick.command(name="edit")
@click.argument("task_id")
@click.option("--title", help="New task title")
@click.option("--project", "-p", help="New project name")
@click.option("--due-date", "-d", help="New due date (today, tomorrow, YYYY-MM-DD)")
@click.option("--priority", "-pr", help="New priority (0-5, low, medium, high)")
@click.option("--content", "-c", help="New task description/content")
def edit_task(
    task_id: str,
    title: Optional[str],
    project: Optional[str],
    due_date: Optional[str],
    priority: Optional[str],
    content: Optional[str],
):
    """Edit an existing task (alias: e)."""
    try:
        # Build update dictionary
        updates = {}

        if title:
            updates["title"] = title
        if project:
            updates["project_name"] = project
        if due_date:
            if not validate_date(due_date):
                console.print(f"[red]Invalid date format: {due_date}[/red]")
                return
            updates["due_date"] = due_date
        if priority:
            updates["priority"] = parse_priority(priority)
        if content:
            updates["content"] = content

        if not updates:
            console.print("[yellow]No changes specified[/yellow]")
            return

        # Update task
        updated_task = get_ticktick_service().update_task(task_id, **updates)

        if updated_task:
            console.print(
                f"[green]✓ Updated task: {updated_task.get('title', task_id)}[/green]"
            )
        else:
            console.print("[red]Failed to update task[/red]")

    except Exception as e:
        console.print(f"[red]Error updating task: {str(e)}[/red]")


@ticktick.command(name="complete")
@click.argument("task_id")
def complete_task(task_id: str):
    """Mark a task as completed (aliases: done, c)."""
    try:
        success = get_ticktick_service().complete_task(task_id)

        if success:
            console.print(f"[green]✓ Completed task: {task_id}[/green]")
        else:
            console.print(f"[red]Failed to complete task: {task_id}[/red]")

    except Exception as e:
        console.print(f"[red]Error completing task: {str(e)}[/red]")


@ticktick.command(name="delete")
@click.argument("task_id")
@click.option("--force", "-f", is_flag=True, help="Skip confirmation prompt")
def delete_task(task_id: str, force: bool):
    """Delete a task (aliases: del, rm)."""
    try:
        if not force:
            if not click.confirm(f"Delete task {task_id}?"):
                console.print("[yellow]Cancelled[/yellow]")
                return

        success = get_ticktick_service().delete_task(task_id)

        if success:
            console.print(f"[green]✓ Deleted task: {task_id}[/green]")
        else:
            console.print(f"[red]Failed to delete task: {task_id}[/red]")

    except Exception as e:
        console.print(f"[red]Error deleting task: {str(e)}[/red]")


@ticktick.command(name="open")
@click.argument("task_id")
@click.option(
    "--browser", "-b", is_flag=True, help="Force open in browser instead of app"
)
def open_task_cmd(task_id: str, browser: bool):
    """Open a task in browser or TickTick app (alias: o)."""
    try:
        open_task(task_id, prefer_app=not browser)
    except Exception as e:
        console.print(f"[red]Error opening task: {str(e)}[/red]")


@ticktick.command(name="show")
@click.argument("task_id")
def show_task(task_id: str):
    """Show detailed task information (aliases: view, s)."""
    try:
        get_ticktick_service()._ensure_client()
        task = get_ticktick_service().client.get_by_id(task_id, search="tasks")

        if not task:
            console.print(f"[red]Task not found: {task_id}[/red]")
            return

        print_task_details(task)

    except Exception as e:
        console.print(f"[red]Error showing task: {str(e)}[/red]")


@ticktick.command(name="projects")
def list_projects():
    """List all projects (alias: proj)."""
    try:
        projects = get_ticktick_service().get_projects()

        if not projects:
            console.print("[yellow]No projects found[/yellow]")
            return

        console.print("[bold cyan]Projects:[/bold cyan]")
        for project in projects:
            name = project.get("name", "Unnamed")
            project_id = project.get("id", "N/A")
            console.print(f"  • [white]{name}[/white] [dim]({project_id})[/dim]")

        console.print(f"\n[dim]Total: {len(projects)} projects[/dim]")

    except Exception as e:
        console.print(f"[red]Error listing projects: {str(e)}[/red]")


@ticktick.command(name="tags")
def list_tags():
    """List all tags."""
    try:
        tags = get_ticktick_service().get_tags()

        if not tags:
            console.print("[yellow]No tags found[/yellow]")
            return

        console.print("[bold green]Tags:[/bold green]")
        for tag in tags:
            name = tag.get("name", "Unnamed")
            console.print(f"  • [green]#{name}[/green]")

        console.print(f"\n[dim]Total: {len(tags)} tags[/dim]")

    except Exception as e:
        console.print(f"[red]Error listing tags: {str(e)}[/red]")


@ticktick.command(name="sync")
def sync_tasks():
    """Sync tasks with TickTick servers."""
    try:
        get_ticktick_service().sync()
        console.print("[green]✓ Synced with TickTick servers[/green]")

    except Exception as e:
        console.print(f"[red]Error syncing: {str(e)}[/red]")


# Add alias commands manually
@click.command()
@click.option("--project", "-p", help="Filter by project name")
@click.option(
    "--due-date", "-d", help="Filter by due date (today, tomorrow, YYYY-MM-DD)"
)
@click.option("--all", "-a", is_flag=True, help="Show all tasks including completed")
@click.option("--priority", "-pr", help="Filter by priority (0-5, low, medium, high)")
@click.option("--tag", "-t", help="Filter by tag name")
@click.option("--limit", "-l", default=20, help="Limit number of results")
def ls(
    project: Optional[str],
    due_date: Optional[str],
    all: bool,
    priority: Optional[str],
    tag: Optional[str],
    limit: int,
):
    """Alias for list command."""
    list_tasks.callback(project, due_date, all, priority, tag, limit)


@click.command()
@click.argument("title")
@click.option("--project", "-p", help="Project name")
@click.option("--due-date", "-d", help="Due date (today, tomorrow, YYYY-MM-DD)")
@click.option("--priority", "-pr", help="Priority (0-5, low, medium, high)")
@click.option("--content", "-c", help="Task description/content")
@click.option("--tags", "-t", help="Comma-separated list of tags")
def a(
    title: str,
    project: Optional[str],
    due_date: Optional[str],
    priority: Optional[str],
    content: Optional[str],
    tags: Optional[str],
):
    """Alias for add command."""
    add_task.callback(title, project, due_date, priority, content, tags)


@click.command()
@click.argument("task_id")
@click.option("--title", help="New task title")
@click.option("--project", "-p", help="New project name")
@click.option("--due-date", "-d", help="New due date (today, tomorrow, YYYY-MM-DD)")
@click.option("--priority", "-pr", help="New priority (0-5, low, medium, high)")
@click.option("--content", "-c", help="New task description/content")
def e(
    task_id: str,
    title: Optional[str],
    project: Optional[str],
    due_date: Optional[str],
    priority: Optional[str],
    content: Optional[str],
):
    """Alias for edit command."""
    edit_task.callback(task_id, title, project, due_date, priority, content)


@click.command()
@click.argument("task_id")
def c(task_id: str):
    """Alias for complete command."""
    complete_task.callback(task_id)


@click.command()
@click.argument("task_id")
def done(task_id: str):
    """Alias for complete command."""
    complete_task.callback(task_id)


@click.command()
@click.argument("task_id")
@click.option("--force", "-f", is_flag=True, help="Skip confirmation prompt")
def rm(task_id: str, force: bool):
    """Alias for delete command."""
    delete_task.callback(task_id, force)


@click.command()
@click.argument("task_id")
@click.option("--force", "-f", is_flag=True, help="Skip confirmation prompt")
def del_cmd(task_id: str, force: bool):
    """Alias for delete command."""
    delete_task.callback(task_id, force)


@click.command()
@click.argument("task_id")
@click.option(
    "--browser", "-b", is_flag=True, help="Force open in browser instead of app"
)
def o(task_id: str, browser: bool):
    """Alias for open command."""
    open_task_cmd.callback(task_id, browser)


@click.command()
@click.argument("task_id")
def s(task_id: str):
    """Alias for show command."""
    show_task.callback(task_id)


@click.command()
@click.argument("task_id")
def view(task_id: str):
    """Alias for show command."""
    show_task.callback(task_id)


@click.command()
def proj():
    """Alias for projects command."""
    list_projects.callback()


# Register all alias commands
ticktick.add_command(ls)
ticktick.add_command(a)
ticktick.add_command(e)
ticktick.add_command(c)
ticktick.add_command(done)
ticktick.add_command(rm)
ticktick.add_command(del_cmd, name="del")
ticktick.add_command(o)
ticktick.add_command(s)
ticktick.add_command(view)
ticktick.add_command(proj)
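
Calling each command's `.callback` directly works here because every alias redeclares identical options, but it bypasses Click's context and costs one near-duplicate function per alias. A more compact pattern, if the duplication ever becomes a burden, is to resolve aliases in a custom `click.Group` (a sketch, not what this file does):

```python
ALIASES = {
    "ls": "list", "a": "add", "e": "edit", "c": "complete", "done": "complete",
    "rm": "delete", "del": "delete", "o": "open", "s": "show", "view": "show",
    "proj": "projects",
}

class AliasedGroup(click.Group):
    def get_command(self, ctx, cmd_name):
        # Map an alias to its canonical command before normal lookup
        return super().get_command(ctx, ALIASES.get(cmd_name, cmd_name))
```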


@ticktick.command(name="setup")
def setup_ticktick():
    """Show TickTick setup instructions."""
    from rich.panel import Panel
    from rich.markdown import Markdown

    setup_text = """
# TickTick Setup Instructions

## 1. Register TickTick Developer App
Visit: https://developer.ticktick.com/docs#/openapi
- Click "Manage Apps" → "+App Name"
- Set OAuth Redirect URL: `http://localhost:8080`
- Note your Client ID and Client Secret

## 2. Set Environment Variables
```bash
export TICKTICK_CLIENT_ID="your_client_id"
export TICKTICK_CLIENT_SECRET="your_client_secret"
export TICKTICK_REDIRECT_URI="http://localhost:8080"

# Optional (you'll be prompted if not set):
export TICKTICK_USERNAME="your_email@example.com"
export TICKTICK_PASSWORD="your_password"
```

## 3. Authentication Note
The TickTick library requires **both** OAuth2 AND login credentials:
- OAuth2: For API authorization
- Username/Password: For initial session setup

This is how the library works, not a limitation of our CLI.

## 4. Start Using
```bash
ticktick ls          # List tasks
ticktick a "Task"    # Add task
```
"""

    console.print(
        Panel(
            Markdown(setup_text),
            title="[bold green]TickTick Setup[/bold green]",
            border_style="green",
        )
    )


@ticktick.command(name="test-auth")
def test_auth():
    """Test authentication and API connectivity."""
    import os
    from src.services.ticktick.auth import (
        get_token_file_path,
        check_token_validity,
        get_ticktick_client,
    )

    console.print("[bold cyan]TickTick Authentication Test[/bold cyan]\n")

    # Check environment
    client_id = os.getenv("TICKTICK_CLIENT_ID")
    client_secret = os.getenv("TICKTICK_CLIENT_SECRET")

    if not client_id or not client_secret:
        console.print("[red]❌ OAuth credentials not set[/red]")
        console.print("Please set TICKTICK_CLIENT_ID and TICKTICK_CLIENT_SECRET")
        return

    console.print("[green]✓ OAuth credentials found[/green]")

    # Check token cache
    validity = check_token_validity()
    if validity["valid"]:
        console.print(f"[green]✓ Token cache: {validity['reason']}[/green]")
    else:
        console.print(f"[yellow]⚠ Token cache: {validity['reason']}[/yellow]")

    # Test client creation
    console.print("\n[bold]Testing TickTick client initialization...[/bold]")
    try:
        client = get_ticktick_client()
        console.print("[green]✓ TickTick client created successfully[/green]")

        # Test API call
        console.print("Testing API connectivity...")
        try:
            projects = client.get_by_fields(search="projects")
            console.print(
                f"[green]✓ API test successful - found {len(projects)} projects[/green]"
            )
        except Exception as api_e:
            console.print(f"[red]❌ API test failed: {api_e}[/red]")

    except Exception as e:
        console.print(f"[red]❌ Client creation failed: {str(e)}[/red]")
        return

    console.print("\n[green]🎉 Authentication test completed successfully![/green]")


@ticktick.command(name="auth-status")
def auth_status():
    """Check TickTick authentication status."""
    import os
    from src.services.ticktick.auth import get_token_file_path, check_token_validity

    console.print("[bold cyan]TickTick Authentication Status[/bold cyan]\n")

    # Check OAuth credentials
    client_id = os.getenv("TICKTICK_CLIENT_ID")
    client_secret = os.getenv("TICKTICK_CLIENT_SECRET")
    redirect_uri = os.getenv("TICKTICK_REDIRECT_URI")

    console.print(f"OAuth Client ID: {'✓ Set' if client_id else '✗ Not set'}")
    console.print(f"OAuth Client Secret: {'✓ Set' if client_secret else '✗ Not set'}")
    console.print(
        f"OAuth Redirect URI: {redirect_uri or '✗ Not set (will use default)'}"
    )

    # Check login credentials
    username = os.getenv("TICKTICK_USERNAME")
    password = os.getenv("TICKTICK_PASSWORD")

    console.print(f"Username: {'✓ Set' if username else '✗ Not set (will prompt)'}")
    console.print(f"Password: {'✓ Set' if password else '✗ Not set (will prompt)'}")

    # Check token cache with validity
    token_file = get_token_file_path()
    token_exists = token_file.exists()

    if token_exists:
        validity = check_token_validity()
        if validity["valid"]:
            console.print("OAuth Token Cache: [green]✓ Valid[/green]")
            if "expires_in_hours" in validity:
                console.print(
                    f"Token expires in: [yellow]{validity['expires_in_hours']} hours[/yellow]"
                )
        else:
            console.print(
                f"OAuth Token Cache: [red]✗ Invalid ({validity['reason']})[/red]"
            )

        import datetime

        mod_time = datetime.datetime.fromtimestamp(token_file.stat().st_mtime)
        console.print(f"Token file: {token_file}")
        console.print(f"Last modified: {mod_time.strftime('%Y-%m-%d %H:%M:%S')}")
    else:
        console.print("OAuth Token Cache: [red]✗ Not found[/red]")
        console.print(f"Token file: {token_file}")

    console.print("\n[dim]Run 'ticktick setup' for setup instructions[/dim]")
0
src/services/gitlab_monitor/__init__.py
Normal file
109
src/services/gitlab_monitor/config.py
Normal file
@@ -0,0 +1,109 @@
import os
import yaml
from typing import Optional, Dict, Any
from pathlib import Path


class GitLabMonitorConfig:
    """Configuration management for GitLab pipeline monitoring daemon."""

    def __init__(self, config_path: Optional[str] = None):
        self.config_path = config_path or os.path.expanduser(
            "~/.config/luk/gitlab_monitor.yaml"
        )
        self.config = self._load_config()

    def _load_config(self) -> Dict[str, Any]:
        """Load configuration from file or create default."""
        config_file = Path(self.config_path)

        if config_file.exists():
            try:
                with open(config_file, "r") as f:
                    return yaml.safe_load(f) or {}
            except Exception as e:
                print(f"Error loading config: {e}")
                return self._default_config()
        else:
            # Create default config file
            config = self._default_config()
            self._save_config(config)
            return config

    def _default_config(self) -> Dict[str, Any]:
        """Return default configuration."""
        return {
            "email_monitoring": {
                "subject_patterns": ["Failed pipeline", "Pipeline failed"],
                "sender_patterns": ["*@gitlab.com", "*gitlab*"],
                "check_interval": 30,  # seconds
            },
            "gitlab": {
                "api_token": os.getenv("GITLAB_API_TOKEN", ""),
                "base_url": "https://gitlab.com",
                "default_project_id": None,
            },
            "openai": {
                "api_key": os.getenv("OPENAI_API_KEY", ""),
                "model": "gpt-4",  # GPT-5 not available yet, using GPT-4
                "max_tokens": 1000,
                "temperature": 0.1,
            },
            "notifications": {
                "enabled": True,
                "sound": True,
                "show_summary_window": True,
            },
            "logging": {
                "level": "INFO",
                "log_file": os.path.expanduser("~/.local/share/luk/gitlab_monitor.log"),
            },
        }

    def _save_config(self, config: Dict[str, Any]):
        """Save configuration to file."""
        config_file = Path(self.config_path)
        config_file.parent.mkdir(parents=True, exist_ok=True)

        with open(config_file, "w") as f:
            yaml.dump(config, f, default_flow_style=False)

    def get_gitlab_token(self) -> Optional[str]:
        """Get GitLab API token."""
        return self.config.get("gitlab", {}).get("api_token") or os.getenv(
            "GITLAB_API_TOKEN"
        )

    def get_openai_key(self) -> Optional[str]:
        """Get OpenAI API key."""
        return self.config.get("openai", {}).get("api_key") or os.getenv(
            "OPENAI_API_KEY"
        )

    def get_subject_patterns(self) -> list:
        """Get email subject patterns to monitor."""
        return self.config.get("email_monitoring", {}).get("subject_patterns", [])

    def get_sender_patterns(self) -> list:
        """Get sender patterns to monitor."""
        return self.config.get("email_monitoring", {}).get("sender_patterns", [])

    def get_check_interval(self) -> int:
        """Get email check interval in seconds."""
        return self.config.get("email_monitoring", {}).get("check_interval", 30)

    def get_gitlab_base_url(self) -> str:
        """Get GitLab base URL."""
        return self.config.get("gitlab", {}).get("base_url", "https://gitlab.com")

    def get_openai_model(self) -> str:
        """Get OpenAI model to use."""
        return self.config.get("openai", {}).get("model", "gpt-4")

    def is_notifications_enabled(self) -> bool:
        """Check if notifications are enabled."""
        return self.config.get("notifications", {}).get("enabled", True)

    def save(self):
        """Save current configuration to file."""
        self._save_config(self.config)
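
For reference, the file written to `~/.config/luk/gitlab_monitor.yaml` on first run would look roughly like this (reconstructed from `_default_config`; `yaml.dump` sorts keys alphabetically, exact quoting may differ, and the home-directory path is a placeholder):

```yaml
email_monitoring:
  check_interval: 30
  sender_patterns:
  - '*@gitlab.com'
  - '*gitlab*'
  subject_patterns:
  - Failed pipeline
  - Pipeline failed
gitlab:
  api_token: ''
  base_url: https://gitlab.com
  default_project_id: null
logging:
  level: INFO
  log_file: /Users/you/.local/share/luk/gitlab_monitor.log
notifications:
  enabled: true
  show_summary_window: true
  sound: true
openai:
  api_key: ''
  max_tokens: 1000
  model: gpt-4
  temperature: 0.1
```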
250
src/services/gitlab_monitor/daemon.py
Normal file
@@ -0,0 +1,250 @@
import asyncio
import logging
import fnmatch
import re
from typing import List, Dict, Any, Optional
from datetime import datetime, timedelta
import time
import os
import sys

# Add src to path for imports
sys.path.append(os.path.join(os.path.dirname(__file__), "..", ".."))

from src.services.himalaya import client as himalaya_client
from .config import GitLabMonitorConfig
from .gitlab_client import GitLabClient
from .openai_analyzer import OpenAIAnalyzer
from .notifications import MacOSNotificationManager


class GitLabPipelineMonitor:
    """Daemon that monitors emails for GitLab pipeline failures and provides AI analysis."""

    def __init__(self, config_path: Optional[str] = None):
        self.config = GitLabMonitorConfig(config_path)
        self.gitlab_client = None
        self.openai_analyzer = None
        self.notifications = MacOSNotificationManager()
        self.last_check_time = datetime.now()
        self.processed_emails = set()  # Track processed email IDs

        self._setup_logging()
        self._initialize_clients()

    def _setup_logging(self):
        """Configure logging."""
        log_level = getattr(
            logging, self.config.config.get("logging", {}).get("level", "INFO")
        )
        log_file = self.config.config.get("logging", {}).get("log_file")

        logging.basicConfig(
            level=log_level,
            format="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
            handlers=[
                logging.StreamHandler(),
                logging.FileHandler(log_file) if log_file else logging.NullHandler(),
            ],
        )

        self.logger = logging.getLogger(__name__)

    def _initialize_clients(self):
        """Initialize GitLab and OpenAI clients."""
        gitlab_token = self.config.get_gitlab_token()
        openai_key = self.config.get_openai_key()

        if not gitlab_token:
            self.logger.warning(
                "GitLab API token not configured. Set GITLAB_API_TOKEN environment variable."
            )
            return

        if not openai_key:
            self.logger.warning(
                "OpenAI API key not configured. Set OPENAI_API_KEY environment variable."
            )
            return

        self.gitlab_client = GitLabClient(
            self.config.get_gitlab_base_url(), gitlab_token
        )

        self.openai_analyzer = OpenAIAnalyzer(
            openai_key, self.config.get_openai_model()
        )

        self.logger.info("GitLab Pipeline Monitor initialized successfully")

    async def start_monitoring(self):
        """Start the email monitoring daemon."""
        if not self.gitlab_client or not self.openai_analyzer:
            self.logger.error("Cannot start monitoring: missing API tokens")
            return

        self.logger.info("Starting GitLab pipeline monitoring daemon...")

        check_interval = self.config.get_check_interval()

        while True:
            try:
                await self._check_for_pipeline_emails()
                await asyncio.sleep(check_interval)
            except KeyboardInterrupt:
                self.logger.info("Monitoring stopped by user")
                break
            except Exception as e:
                self.logger.error(f"Error in monitoring loop: {e}")
                await asyncio.sleep(check_interval)

    async def _check_for_pipeline_emails(self):
        """Check for new GitLab pipeline failure emails."""
        try:
            # Get recent emails using the existing Himalaya client
            envelopes, success = await himalaya_client.list_envelopes(limit=50)

            if not success or not envelopes:
                return

            for envelope in envelopes:
                # Skip if we've already processed this email
                email_id = envelope.get("id")
                if email_id in self.processed_emails:
                    continue

                # Check if email matches our patterns
                if self._is_pipeline_failure_email(envelope):
                    self.logger.info(f"Found pipeline failure email: {email_id}")
                    await self._process_pipeline_failure_email(envelope)
                    self.processed_emails.add(email_id)

            # Limit the size of processed emails set
            if len(self.processed_emails) > 1000:
                # Keep only the most recent 500
                recent_emails = list(self.processed_emails)[-500:]
                self.processed_emails = set(recent_emails)

        except Exception as e:
            self.logger.error(f"Error checking emails: {e}")

    def _is_pipeline_failure_email(self, envelope: Dict[str, Any]) -> bool:
        """Check if email matches pipeline failure patterns."""
        subject = envelope.get("subject", "").lower()
        sender_addr = envelope.get("from", {}).get("addr", "").lower()

        # Check subject patterns
        subject_patterns = self.config.get_subject_patterns()
        subject_match = any(pattern.lower() in subject for pattern in subject_patterns)

        # Check sender patterns
        sender_patterns = self.config.get_sender_patterns()
        sender_match = any(
            fnmatch.fnmatch(sender_addr, pattern.lower()) for pattern in sender_patterns
        )

        return subject_match and sender_match

    async def _process_pipeline_failure_email(self, envelope: Dict[str, Any]):
        """Process a pipeline failure email."""
        try:
            # Get email content
            email_id = envelope.get("id")
            content, success = await himalaya_client.get_message_content(email_id)

            if not success or not content:
                self.logger.error(f"Failed to get content for email {email_id}")
                return

            # Extract GitLab project and pipeline information
            project_info = self.gitlab_client.extract_project_info_from_email(content)

            if not project_info:
                self.logger.warning(
                    f"Could not extract GitLab info from email {email_id}"
                )
                return

            project_path = project_info.get("project_path")
            pipeline_id = project_info.get("pipeline_id")

            if not project_path or not pipeline_id:
                self.logger.warning(
                    f"Missing project path or pipeline ID in email {email_id}"
                )
                return

            # Get GitLab project
            project = self.gitlab_client.get_project_by_path(project_path)
            if not project:
                self.logger.error(f"Could not find GitLab project: {project_path}")
                return

            project_id = project["id"]
            project_name = project["name"]

            # Get failed jobs with traces
            failed_jobs = self.gitlab_client.get_failed_jobs_with_traces(
                project_id, pipeline_id
            )

            if not failed_jobs:
                self.logger.info(f"No failed jobs found for pipeline {pipeline_id}")
                return

            # Send initial notification
            if self.config.is_notifications_enabled():
                self.notifications.send_pipeline_failure_notification(
                    project_name, pipeline_id, len(failed_jobs)
                )

            # Analyze failures with OpenAI
            analysis = self.openai_analyzer.analyze_pipeline_failures(failed_jobs)

            if analysis:
                self.logger.info(f"Analysis completed for pipeline {pipeline_id}")

                # Show analysis window
                if self.config.is_notifications_enabled():
                    self.notifications.show_failure_analysis(
                        project_name, pipeline_id, analysis
                    )
            else:
                self.logger.error(f"Failed to analyze pipeline {pipeline_id}")

        except Exception as e:
            self.logger.error(f"Error processing pipeline failure email: {e}")


async def main():
    """Main entry point for the daemon."""
    import argparse

    parser = argparse.ArgumentParser(description="GitLab Pipeline Monitoring Daemon")
    parser.add_argument("--config", help="Path to configuration file")
    parser.add_argument(
        "--test", action="store_true", help="Test configuration and exit"
    )

    args = parser.parse_args()

    monitor = GitLabPipelineMonitor(args.config)

    if args.test:
        print("Configuration test:")
        print(
            f"GitLab token configured: {'Yes' if monitor.config.get_gitlab_token() else 'No'}"
        )
        print(
            f"OpenAI key configured: {'Yes' if monitor.config.get_openai_key() else 'No'}"
        )
        print(f"Subject patterns: {monitor.config.get_subject_patterns()}")
        print(f"Sender patterns: {monitor.config.get_sender_patterns()}")
        print(f"Check interval: {monitor.config.get_check_interval()}s")
        return

    await monitor.start_monitoring()


if __name__ == "__main__":
    asyncio.run(main())
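
To smoke-test the daemon, something like the following should work, assuming the repo root is the working directory and the package layout shown above (the token values are placeholders):

```bash
export GITLAB_API_TOKEN="glpat-..."   # placeholder
export OPENAI_API_KEY="sk-..."        # placeholder
python -m src.services.gitlab_monitor.daemon --test   # print config status and exit
python -m src.services.gitlab_monitor.daemon          # start the monitoring loop
```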
106
src/services/gitlab_monitor/gitlab_client.py
Normal file
@@ -0,0 +1,106 @@
import requests
import logging
from typing import Optional, Dict, Any, List
from urllib.parse import urljoin


class GitLabClient:
    """Client for interacting with GitLab CI API."""

    def __init__(self, base_url: str, api_token: str):
        self.base_url = base_url.rstrip("/")
        self.api_token = api_token
        self.session = requests.Session()
        self.session.headers.update(
            {"Authorization": f"Bearer {api_token}", "Content-Type": "application/json"}
        )

    def _make_request(
        self, endpoint: str, method: str = "GET", **kwargs
    ) -> Optional[Dict]:
        """Make API request to GitLab."""
        url = urljoin(f"{self.base_url}/api/v4/", endpoint)

        try:
            response = self.session.request(method, url, **kwargs)
            response.raise_for_status()
            return response.json()
        except requests.exceptions.RequestException as e:
            logging.error(f"GitLab API request failed: {e}")
            return None

    def extract_project_info_from_email(
        self, email_content: str
    ) -> Optional[Dict[str, Any]]:
        """Extract project ID and pipeline ID from GitLab notification email."""
        import re

        # Common patterns in GitLab emails
        patterns = {
            "project_url": r"https?://[^/]+/([^/]+/[^/]+)/-/pipelines/(\d+)",
            "pipeline_id": r"Pipeline #(\d+)",
            "project_name": r"Project:\s*([^\n]+)",
            "pipeline_url": r"(https?://[^/]+/[^/]+/[^/]+/-/pipelines/\d+)",
        }

        extracted = {}

        for key, pattern in patterns.items():
            match = re.search(pattern, email_content)
            if match:
                if key == "project_url":
                    extracted["project_path"] = match.group(1)
                    extracted["pipeline_id"] = int(match.group(2))
                elif key == "pipeline_id":
                    extracted["pipeline_id"] = int(match.group(1))
                elif key == "project_name":
                    extracted["project_name"] = match.group(1).strip()
                elif key == "pipeline_url":
                    extracted["pipeline_url"] = match.group(1)

        return extracted if extracted else None

    def get_project_by_path(self, project_path: str) -> Optional[Dict]:
        """Get project information by path (namespace/project)."""
        encoded_path = project_path.replace("/", "%2F")
        return self._make_request(f"projects/{encoded_path}")

    def get_pipeline(self, project_id: int, pipeline_id: int) -> Optional[Dict]:
        """Get pipeline information."""
        return self._make_request(f"projects/{project_id}/pipelines/{pipeline_id}")

    def get_pipeline_jobs(
        self, project_id: int, pipeline_id: int
    ) -> Optional[List[Dict]]:
        """Get jobs for a pipeline."""
        return self._make_request(f"projects/{project_id}/pipelines/{pipeline_id}/jobs")

    def get_job_trace(self, project_id: int, job_id: int) -> Optional[str]:
        """Get trace log for a specific job."""
        url = urljoin(
            f"{self.base_url}/api/v4/", f"projects/{project_id}/jobs/{job_id}/trace"
        )

        try:
            response = self.session.get(url)
            response.raise_for_status()
            return response.text
        except requests.exceptions.RequestException as e:
            logging.error(f"Failed to get job trace: {e}")
            return None

    def get_failed_jobs_with_traces(
        self, project_id: int, pipeline_id: int
    ) -> List[Dict]:
        """Get all failed jobs with their trace logs."""
        jobs = self.get_pipeline_jobs(project_id, pipeline_id)
        if not jobs:
            return []

        failed_jobs = [job for job in jobs if job.get("status") == "failed"]

        for job in failed_jobs:
            trace = self.get_job_trace(project_id, job["id"])
            job["trace"] = trace

        return failed_jobs
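
A quick interactive sketch of the client, assuming a valid token and an email body already in hand (only the methods defined above are used; the token is a placeholder):

```python
client = GitLabClient("https://gitlab.com", api_token="glpat-...")  # placeholder token
info = client.extract_project_info_from_email(email_body)  # email_body: str
if info and info.get("project_path") and info.get("pipeline_id"):
    project = client.get_project_by_path(info["project_path"])
    if project:
        jobs = client.get_failed_jobs_with_traces(project["id"], info["pipeline_id"])
        print(f"{len(jobs)} failed job(s) in pipeline {info['pipeline_id']}")
```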
69
src/services/gitlab_monitor/notifications.py
Normal file
@@ -0,0 +1,69 @@
import subprocess
import logging


class MacOSNotificationManager:
    """Manager for macOS notifications and display windows."""

    def __init__(self):
        self.app_name = "GitLab Pipeline Monitor"

    def send_notification(self, title: str, message: str, sound: bool = True) -> bool:
        """Send a macOS notification, optionally with a sound."""
        try:
            cmd = [
                "osascript",
                "-e",
                f'''display notification "{message}" with title "{title}" subtitle "{self.app_name}"''',
            ]

            if sound:
                cmd[-1] += ' sound name "Glass"'

            subprocess.run(cmd, check=True, capture_output=True)
            return True
        except subprocess.CalledProcessError as e:
            logging.error(f"Failed to send notification: {e}")
            return False

    def show_summary_window(self, title: str, summary: str) -> bool:
        """Display a summary window using AppleScript."""
        try:
            # Escape backslashes first, then quotes and newlines, so the text
            # survives embedding in the AppleScript string literal
            escaped_summary = (
                summary.replace("\\", "\\\\").replace('"', '\\"').replace("\n", "\\n")
            )

            applescript = f'''
            tell application "System Events"
                display dialog "{escaped_summary}" with title "{title}" buttons {{"Copy", "Close"}} default button "Close" with icon note giving up after 300
                set buttonPressed to button returned of result
                if buttonPressed is "Copy" then
                    set the clipboard to "{escaped_summary}"
                end if
            end tell
            '''

            subprocess.run(
                ["osascript", "-e", applescript], check=True, capture_output=True
            )
            return True
        except subprocess.CalledProcessError as e:
            logging.error(f"Failed to show summary window: {e}")
            return False

    def send_pipeline_failure_notification(
        self, project_name: str, pipeline_id: int, job_count: int
    ) -> bool:
        """Send notification specifically for pipeline failures."""
        title = "GitLab Pipeline Failed"
        message = f"{project_name} Pipeline #{pipeline_id} - {job_count} failed job(s)"
        return self.send_notification(title, message, sound=True)

    def show_failure_analysis(
        self, project_name: str, pipeline_id: int, analysis: str
    ) -> bool:
        """Show the AI analysis of pipeline failures."""
        title = f"Pipeline Analysis - {project_name} #{pipeline_id}"
        return self.show_summary_window(title, analysis)
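
The underlying osascript call can be sanity-checked from a terminal before wiring it into the daemon (macOS only):

```bash
osascript -e 'display notification "test body" with title "test title" sound name "Glass"'
```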
79
src/services/gitlab_monitor/openai_analyzer.py
Normal file
@@ -0,0 +1,79 @@
import openai
import logging
from typing import Dict, List, Optional


class OpenAIAnalyzer:
    """OpenAI client for analyzing pipeline failure logs."""

    def __init__(self, api_key: str, model: str = "gpt-4", max_tokens: int = 1000):
        self.client = openai.OpenAI(api_key=api_key)
        self.model = model
        self.max_tokens = max_tokens

    def analyze_pipeline_failures(self, failed_jobs: List[Dict]) -> Optional[str]:
        """Analyze pipeline failures and provide a summary and fix suggestions."""
        if not failed_jobs:
            return None

        # Prepare the analysis prompt
        analysis_prompt = self._build_analysis_prompt(failed_jobs)

        try:
            response = self.client.chat.completions.create(
                model=self.model,
                messages=[
                    {
                        "role": "system",
                        "content": "You are a senior DevOps engineer helping to diagnose CI/CD pipeline failures. Provide concise, actionable summaries and solutions.",
                    },
                    {"role": "user", "content": analysis_prompt},
                ],
                max_tokens=self.max_tokens,
                temperature=0.1,
            )

            return response.choices[0].message.content
        except Exception as e:
            logging.error(f"OpenAI analysis failed: {e}")
            return None

    def _build_analysis_prompt(self, failed_jobs: List[Dict]) -> str:
        """Build the analysis prompt for OpenAI."""
        prompt = """Analyze the following GitLab CI pipeline failures and provide:

1. A brief summary of what went wrong (2-3 sentences max)
2. Specific fix recommendations for each job type
3. Organize by job name/type for easy scanning

Failed Jobs:
"""

        for job in failed_jobs:
            job_name = job.get("name", "Unknown Job")
            job_stage = job.get("stage", "Unknown Stage")
            # get_job_trace() may have stored None; guard before calling len()
            trace = job.get("trace") or "No trace available"

            # Truncate trace if too long (keep last 2000 chars for most relevant errors)
            if len(trace) > 2000:
                trace = "..." + trace[-2000:]

            prompt += f"""
## {job_name} (Stage: {job_stage})

```
{trace}
```

"""

        prompt += """
Please categorize fixes by job type:
- **Linting/Formatting** (eslint, prettier, black, etc.): Quick syntax fixes
- **Type Checking** (typescript, mypy, etc.): Type annotation issues
- **Tests** (jest, pytest, etc.): Test failures requiring code analysis
- **Build/Deploy**: Configuration or dependency issues

Format your response with clear headings and bullet points for quick scanning."""

        return prompt
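
A minimal usage sketch, with a fabricated job dict shaped like `get_failed_jobs_with_traces()` output and a placeholder API key:

```python
analyzer = OpenAIAnalyzer(api_key="sk-...")  # placeholder key
jobs = [{"name": "pytest", "stage": "test", "trace": "FAILED tests/test_app.py::test_x"}]
print(analyzer.analyze_pipeline_failures(jobs) or "analysis unavailable")
```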
0
src/services/godspeed/__init__.py
Normal file
136
src/services/godspeed/client.py
Normal file
@@ -0,0 +1,136 @@
@@ -0,0 +1,136 @@
"""Godspeed API client for task and list management."""

import json
import os
import re
import requests
from pathlib import Path
from typing import Dict, List, Optional, Any
from datetime import datetime
import urllib3


class GodspeedClient:
    """Client for interacting with the Godspeed API."""

    BASE_URL = "https://api.godspeedapp.com"

    def __init__(
        self,
        email: Optional[str] = None,
        password: Optional[str] = None,
        token: Optional[str] = None,
    ):
        self.email = email
        self.password = password
        self.token = token
        self.session = requests.Session()

        # Handle SSL verification bypass for corporate networks
        disable_ssl = os.getenv("GODSPEED_DISABLE_SSL_VERIFY", "").lower() == "true"
        if disable_ssl:
            self.session.verify = False
            # Suppress only the specific warning about unverified HTTPS requests
            urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
            print("⚠️ SSL verification disabled for Godspeed API")

        if token:
            self.session.headers.update({"Authorization": f"Bearer {token}"})
        elif email and password:
            self._authenticate()

    def _authenticate(self) -> str:
        """Authenticate and get access token."""
        if not self.email or not self.password:
            raise ValueError("Email and password required for authentication")

        response = self.session.post(
            f"{self.BASE_URL}/sessions/sign_in",
            json={"email": self.email, "password": self.password},
            headers={"Content-Type": "application/json"},
        )
        response.raise_for_status()

        data = response.json()
        if not data.get("success"):
            raise Exception("Authentication failed")

        self.token = data["token"]
        self.session.headers.update({"Authorization": f"Bearer {self.token}"})
        return self.token

    def get_lists(self) -> List[Dict[str, Any]]:
        """Get all lists."""
        response = self.session.get(f"{self.BASE_URL}/lists")
        response.raise_for_status()
        return response.json()

    def get_tasks(
        self, list_id: Optional[str] = None, status: Optional[str] = None
    ) -> Dict[str, Any]:
        """Get tasks with optional filtering."""
        params = {}
        if list_id:
            params["list_id"] = list_id
        if status:
            params["status"] = status

        response = self.session.get(f"{self.BASE_URL}/tasks", params=params)
        response.raise_for_status()
        return response.json()

    def get_task(self, task_id: str) -> Dict[str, Any]:
        """Get a single task by ID."""
        response = self.session.get(f"{self.BASE_URL}/tasks/{task_id}")
        response.raise_for_status()
        return response.json()

    def create_task(
        self,
        title: str,
        list_id: Optional[str] = None,
        notes: Optional[str] = None,
        location: str = "end",
        **kwargs,
    ) -> Dict[str, Any]:
        """Create a new task."""
        data = {"title": title, "location": location}

        if list_id:
            data["list_id"] = list_id
        if notes:
            data["notes"] = notes

        # Add any additional kwargs
        data.update(kwargs)

        response = self.session.post(
            f"{self.BASE_URL}/tasks",
            json=data,
            headers={"Content-Type": "application/json"},
        )
        response.raise_for_status()
        return response.json()

    def update_task(self, task_id: str, **kwargs) -> Dict[str, Any]:
        """Update an existing task."""
        response = self.session.patch(
            f"{self.BASE_URL}/tasks/{task_id}",
            json=kwargs,
            headers={"Content-Type": "application/json"},
        )
        response.raise_for_status()
        return response.json()

    def delete_task(self, task_id: str) -> None:
        """Delete a task."""
        response = self.session.delete(f"{self.BASE_URL}/tasks/{task_id}")
        response.raise_for_status()

    def complete_task(self, task_id: str) -> Dict[str, Any]:
        """Mark a task as complete."""
        return self.update_task(task_id, is_complete=True)

    def incomplete_task(self, task_id: str) -> Dict[str, Any]:
        """Mark a task as incomplete."""
        return self.update_task(task_id, is_complete=False)
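For reviewers, a minimal usage sketch of the client above. The `GODSPEED_TOKEN` environment variable mirrors what `config.py` (next file) reads; the two response shapes handled for the created task are the same ones `sync.py` later accounts for. This is illustrative only, not part of the diff.

```python
# Sketch: driving GodspeedClient directly, assuming the API behaves as the code above suggests.
import os

from src.services.godspeed.client import GodspeedClient

client = GodspeedClient(token=os.environ["GODSPEED_TOKEN"])  # token auth skips sign_in

task = client.create_task("Review sync PR", notes="Check metadata handling")
# Response shape varies; handle both {"id": ...} and {"task": {"id": ...}}
task_id = task.get("id") or task.get("task", {}).get("id")
client.complete_task(task_id)
client.delete_task(task_id)
```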
87
src/services/godspeed/config.py
Normal file
@@ -0,0 +1,87 @@
"""Configuration and credential management for Godspeed sync."""

import json
import os
from pathlib import Path
from typing import Optional, Dict, Any


class GodspeedConfig:
    """Manages configuration and credentials for Godspeed sync."""

    def __init__(self, config_dir: Optional[Path] = None):
        if config_dir is None:
            config_dir = Path.home() / ".local" / "share" / "gtd-terminal-tools"

        self.config_dir = Path(config_dir)
        self.config_file = self.config_dir / "godspeed_config.json"
        self.config = self._load_config()

    def _load_config(self) -> Dict[str, Any]:
        """Load configuration from file."""
        if self.config_file.exists():
            with open(self.config_file, "r") as f:
                return json.load(f)
        return {}

    def _save_config(self):
        """Save configuration to file."""
        self.config_dir.mkdir(parents=True, exist_ok=True)
        with open(self.config_file, "w") as f:
            json.dump(self.config, f, indent=2)

    def get_email(self) -> Optional[str]:
        """Get stored email or from environment."""
        return os.getenv("GODSPEED_EMAIL") or self.config.get("email")

    def set_email(self, email: str):
        """Store email in config."""
        self.config["email"] = email
        self._save_config()

    def get_token(self) -> Optional[str]:
        """Get stored token or from environment."""
        return os.getenv("GODSPEED_TOKEN") or self.config.get("token")

    def set_token(self, token: str):
        """Store token in config."""
        self.config["token"] = token
        self._save_config()

    def get_sync_directory(self) -> Path:
        """Get sync directory from config or environment."""
        sync_dir = os.getenv("GODSPEED_SYNC_DIR") or self.config.get("sync_directory")

        if sync_dir:
            return Path(sync_dir)

        # Default to ~/Documents/Godspeed or ~/.local/share/gtd-terminal-tools/godspeed
        home = Path.home()

        # Try Documents first
        docs_dir = home / "Documents" / "Godspeed"
        if docs_dir.parent.exists():
            return docs_dir

        # Fall back to data directory
        return home / ".local" / "share" / "gtd-terminal-tools" / "godspeed"

    def set_sync_directory(self, sync_dir: Path):
        """Store sync directory in config."""
        self.config["sync_directory"] = str(sync_dir)
        self._save_config()

    def clear_credentials(self):
        """Clear stored credentials."""
        self.config.pop("email", None)
        self.config.pop("token", None)
        self._save_config()

    def get_all_settings(self) -> Dict[str, Any]:
        """Get all current settings."""
        return {
            "email": self.get_email(),
            "has_token": bool(self.get_token()),
            "sync_directory": str(self.get_sync_directory()),
            "config_file": str(self.config_file),
        }
395
src/services/godspeed/sync.py
Normal file
@@ -0,0 +1,395 @@
"""Two-way synchronization engine for Godspeed API and local markdown files."""

import json
import os
import re
from pathlib import Path
from typing import Dict, List, Optional, Set, Tuple
from datetime import datetime

from .client import GodspeedClient


class GodspeedSync:
    """Handles bidirectional sync between Godspeed API and local markdown files."""

    def __init__(self, client: GodspeedClient, sync_dir: Path):
        self.client = client
        self.sync_dir = Path(sync_dir)
        self.metadata_file = self.sync_dir / ".godspeed_metadata.json"
        self.metadata = self._load_metadata()

    def _load_metadata(self) -> Dict:
        """Load sync metadata from local file."""
        if self.metadata_file.exists():
            with open(self.metadata_file, "r") as f:
                return json.load(f)
        return {
            "task_mapping": {},  # local_id -> godspeed_id
            "list_mapping": {},  # list_name -> list_id
            "last_sync": None,
        }

    def _save_metadata(self):
        """Save sync metadata to local file."""
        self.sync_dir.mkdir(parents=True, exist_ok=True)
        with open(self.metadata_file, "w") as f:
            json.dump(self.metadata, f, indent=2)

    def _sanitize_filename(self, name: str) -> str:
        """Convert list name to safe filename."""
        # Replace special characters with underscores
        sanitized = re.sub(r'[<>:"/\\|?*]', "_", name)
        # Remove multiple underscores
        sanitized = re.sub(r"_+", "_", sanitized)
        # Strip leading/trailing underscores and spaces
        return sanitized.strip("_ ")

    def _generate_local_id(self) -> str:
        """Generate a unique local ID for tracking."""
        import uuid

        return str(uuid.uuid4())[:8]

    def _parse_task_line(self, line: str) -> Optional[Tuple[str, str, str, str]]:
        """Parse a markdown task line and extract components.

        Returns: (local_id, status, title, notes) or None if invalid
        status can be: 'incomplete', 'complete', or 'cleared'
        """
        # Match patterns like:
        # - [ ] Task title <!-- id:abc123 -->
        # - [x] Completed task <!-- id:def456 -->
        # - [-] Cleared/cancelled task <!-- id:ghi789 -->
        # - [ ] Task with notes <!-- id:jkl012 --> Some notes here

        task_pattern = r"^\s*-\s*\[([xX\s\-])\]\s*(.+?)(?:\s*<!--\s*id:(\w+)\s*-->)?\s*(?:\n\s*(.+))?$"
        match = re.match(task_pattern, line.strip(), re.MULTILINE | re.DOTALL)

        if not match:
            return None

        checkbox, title_and_maybe_notes, local_id, extra_notes = match.groups()

        # Determine status from checkbox
        if checkbox.lower() == "x":
            status = "complete"
        elif checkbox == "-":
            status = "cleared"
        else:
            status = "incomplete"

        # Split title and inline notes if present
        title_parts = title_and_maybe_notes.split("<!--")[0].strip()
        notes = extra_notes.strip() if extra_notes else ""

        if not local_id:
            local_id = self._generate_local_id()

        return local_id, status, title_parts, notes

    def _format_task_line(
        self, local_id: str, status: str, title: str, notes: str = ""
    ) -> str:
        """Format a task as a markdown line with ID tracking."""
        if status == "complete":
            checkbox = "[x]"
        elif status == "cleared":
            checkbox = "[-]"
        else:
            checkbox = "[ ]"

        line = f"- {checkbox} {title} <!-- id:{local_id} -->"
        if notes:
            line += f"\n {notes}"
        return line

    def _read_list_file(self, list_path: Path) -> List[Tuple[str, str, str, str]]:
        """Read and parse tasks from a markdown file."""
        if not list_path.exists():
            return []

        tasks = []
        with open(list_path, "r", encoding="utf-8") as f:
            content = f.read()

        # Split into potential task blocks
        lines = content.split("\n")
        current_task_lines = []

        for line in lines:
            if line.strip().startswith("- ["):
                # Process previous task if exists
                if current_task_lines:
                    task_block = "\n".join(current_task_lines)
                    parsed = self._parse_task_line(task_block)
                    if parsed:
                        tasks.append(parsed)
                    current_task_lines = []

                current_task_lines = [line]
            elif current_task_lines and line.strip():
                # Continuation of current task (notes)
                current_task_lines.append(line)
            elif current_task_lines:
                # Empty line ends the current task
                task_block = "\n".join(current_task_lines)
                parsed = self._parse_task_line(task_block)
                if parsed:
                    tasks.append(parsed)
                current_task_lines = []

        # Process last task if exists
        if current_task_lines:
            task_block = "\n".join(current_task_lines)
            parsed = self._parse_task_line(task_block)
            if parsed:
                tasks.append(parsed)

        return tasks

    def _write_list_file(self, list_path: Path, tasks: List[Tuple[str, str, str, str]]):
        """Write tasks to a markdown file."""
        list_path.parent.mkdir(parents=True, exist_ok=True)

        with open(list_path, "w", encoding="utf-8") as f:
            for local_id, status, title, notes in tasks:
                f.write(self._format_task_line(local_id, status, title, notes))
                f.write("\n")

    def download_from_api(self) -> None:
        """Download all lists and tasks from Godspeed API to local files."""
        print("Downloading from Godspeed API...")

        # Get all lists
        lists_data = self.client.get_lists()
        lists = (
            lists_data if isinstance(lists_data, list) else lists_data.get("lists", [])
        )

        # Update list mapping
        for list_item in lists:
            list_name = list_item["name"]
            list_id = list_item["id"]
            self.metadata["list_mapping"][list_name] = list_id

        # Get only incomplete tasks (hide completed/cleared from local files)
        all_tasks_data = self.client.get_tasks(status="incomplete")
        tasks = all_tasks_data.get("tasks", [])
        task_lists = all_tasks_data.get("lists", {})

        # Group tasks by list
        tasks_by_list = {}
        for task in tasks:
            list_id = task.get("list_id")
            if list_id in task_lists:
                list_name = task_lists[list_id]["name"]
            else:
                # Find list name from our mapping
                list_name = None
                for name, lid in self.metadata["list_mapping"].items():
                    if lid == list_id:
                        list_name = name
                        break
                if not list_name:
                    list_name = "Unknown"

            if list_name not in tasks_by_list:
                tasks_by_list[list_name] = []
            tasks_by_list[list_name].append(task)

        # Create directory structure and files
        for list_name, list_tasks in tasks_by_list.items():
            safe_name = self._sanitize_filename(list_name)
            list_path = self.sync_dir / f"{safe_name}.md"

            # Convert API tasks to our format
            local_tasks = []
            for task in list_tasks:
                # Find existing local ID or create new one
                godspeed_id = task["id"]
                local_id = None
                for lid, gid in self.metadata["task_mapping"].items():
                    if gid == godspeed_id:
                        local_id = lid
                        break

                if not local_id:
                    local_id = self._generate_local_id()
                    self.metadata["task_mapping"][local_id] = godspeed_id

                # Convert API task status to our format
                is_complete = task.get("is_complete", False)
                is_cleared = task.get("is_cleared", False)

                if is_cleared:
                    status = "cleared"
                elif is_complete:
                    status = "complete"
                else:
                    status = "incomplete"

                title = task["title"]
                notes = task.get("notes", "")

                local_tasks.append((local_id, status, title, notes))

            self._write_list_file(list_path, local_tasks)
            print(f" Downloaded {len(local_tasks)} tasks to {list_path}")

        self.metadata["last_sync"] = datetime.now().isoformat()
        self._save_metadata()
        print(f"Download complete. Synced {len(tasks_by_list)} lists.")

    def upload_to_api(self) -> None:
        """Upload local markdown files to Godspeed API."""
        print("Uploading to Godspeed API...")

        # Find all markdown files
        md_files = list(self.sync_dir.glob("*.md"))

        for md_file in md_files:
            if md_file.name.startswith("."):
                continue  # Skip hidden files

            list_name = md_file.stem
            local_tasks = self._read_list_file(md_file)

            # Get or create list ID
            list_id = self.metadata["list_mapping"].get(list_name)
            if not list_id:
                print(
                    f" Warning: No list ID found for '{list_name}', tasks will go to Inbox"
                )
                list_id = None

            for local_id, status, title, notes in local_tasks:
                # Skip tasks with empty titles
                if not title or not title.strip():
                    print(f" Skipping task with empty title (id: {local_id})")
                    continue

                godspeed_id = self.metadata["task_mapping"].get(local_id)

                if godspeed_id:
                    # Update existing task
                    try:
                        update_data = {"title": title.strip()}

                        # Handle status conversion to API format
                        if status == "complete":
                            update_data["is_complete"] = True
                            update_data["is_cleared"] = False
                        elif status == "cleared":
                            # Note: API requires task to be complete before clearing
                            update_data["is_complete"] = True
                            update_data["is_cleared"] = True
                        else:  # incomplete
                            update_data["is_complete"] = False
                            update_data["is_cleared"] = False

                        if notes and notes.strip():
                            update_data["notes"] = notes.strip()

                        self.client.update_task(godspeed_id, **update_data)

                        action = {
                            "complete": "completed",
                            "cleared": "cleared",
                            "incomplete": "reopened",
                        }[status]
                        print(f" Updated task ({action}): {title}")
                    except Exception as e:
                        print(f" Error updating task '{title}': {e}")
                else:
                    # Create new task
                    try:
                        create_data = {
                            "title": title.strip(),
                            "list_id": list_id,
                        }

                        # Only add notes if they exist and are not empty
                        if notes and notes.strip():
                            create_data["notes"] = notes.strip()

                        print(f" Creating task: '{title}' with data: {create_data}")
                        response = self.client.create_task(**create_data)
                        print(f" API response: {response}")

                        # Handle different response formats
                        if isinstance(response, dict):
                            if "id" in response:
                                new_godspeed_id = response["id"]
                            elif "task" in response and "id" in response["task"]:
                                new_godspeed_id = response["task"]["id"]
                            else:
                                print(
                                    f" Warning: No ID found in response: {response}"
                                )
                                continue
                        else:
                            print(
                                f" Warning: Unexpected response format: {response}"
                            )
                            continue

                        self.metadata["task_mapping"][local_id] = new_godspeed_id

                        # Set status if not incomplete
                        if status == "complete":
                            self.client.update_task(new_godspeed_id, is_complete=True)
                            print(f" Created completed task: {title}")
                        elif status == "cleared":
                            # Mark complete first, then clear
                            self.client.update_task(
                                new_godspeed_id, is_complete=True, is_cleared=True
                            )
                            print(f" Created cleared task: {title}")
                        else:
                            print(f" Created task: {title}")
                    except Exception as e:
                        print(f" Error creating task '{title}': {e}")
                        import traceback

                        traceback.print_exc()

        self.metadata["last_sync"] = datetime.now().isoformat()
        self._save_metadata()
        print("Upload complete.")

    def sync_bidirectional(self) -> None:
        """Perform a full bidirectional sync."""
        print("Starting bidirectional sync...")

        # Download first to get latest state
        self.download_from_api()

        # Then upload any local changes
        self.upload_to_api()

        print("Bidirectional sync complete.")

    def list_local_files(self) -> List[Path]:
        """List all markdown files in sync directory."""
        if not self.sync_dir.exists():
            return []
        return list(self.sync_dir.glob("*.md"))

    def get_sync_status(self) -> Dict:
        """Get current sync status and statistics."""
        local_files = self.list_local_files()

        total_local_tasks = 0
        for file_path in local_files:
            tasks = self._read_list_file(file_path)
            total_local_tasks += len(tasks)

        return {
            "sync_directory": str(self.sync_dir),
            "local_files": len(local_files),
            "total_local_tasks": total_local_tasks,
            "tracked_tasks": len(self.metadata["task_mapping"]),
            "tracked_lists": len(self.metadata["list_mapping"]),
            "last_sync": self.metadata.get("last_sync"),
        }
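For reference while reading `_parse_task_line` and `_format_task_line` above, this is the markdown shape the sync engine round-trips, taken from the patterns documented in the code; the ids are illustrative:

```markdown
- [ ] Buy groceries <!-- id:a1b2c3d4 -->
 Pick up milk and eggs
- [x] File expense report <!-- id:e5f6a7b8 -->
- [-] Cancelled standup <!-- id:c9d0e1f2 -->
```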
@@ -1,15 +1,30 @@
 """
 Authentication module for Microsoft Graph API.
 """

 import os
 import msal
+import logging
 from rich import print
 from rich.panel import Panel

+# Comprehensive logging suppression for authentication-related libraries
+logging.getLogger("msal").setLevel(logging.ERROR)
+logging.getLogger("urllib3").setLevel(logging.ERROR)
+logging.getLogger("requests").setLevel(logging.ERROR)
+logging.getLogger("requests_oauthlib").setLevel(logging.ERROR)
+logging.getLogger("aiohttp").setLevel(logging.ERROR)
+logging.getLogger("aiohttp.access").setLevel(logging.ERROR)
+logging.getLogger("asyncio").setLevel(logging.ERROR)
+logging.getLogger("azure").setLevel(logging.ERROR)
+logging.getLogger("azure.core").setLevel(logging.ERROR)


 def ensure_directory_exists(path):
     if not os.path.exists(path):
         os.makedirs(path)


 def get_access_token(scopes):
     """
     Authenticate with Microsoft Graph API and obtain an access token.
@@ -26,43 +41,75 @@ def get_access_token(scopes):
         Exception: If authentication fails.
     """
     # Read Azure app credentials from environment variables
-    client_id = os.getenv('AZURE_CLIENT_ID')
-    tenant_id = os.getenv('AZURE_TENANT_ID')
+    client_id = os.getenv("AZURE_CLIENT_ID")
+    tenant_id = os.getenv("AZURE_TENANT_ID")

     if not client_id or not tenant_id:
-        raise ValueError("Please set the AZURE_CLIENT_ID and AZURE_TENANT_ID environment variables.")
+        raise ValueError(
+            "Please set the AZURE_CLIENT_ID and AZURE_TENANT_ID environment variables."
+        )

     # Token cache
     cache = msal.SerializableTokenCache()
-    cache_file = 'token_cache.bin'
+    cache_file = "token_cache.bin"

     if os.path.exists(cache_file):
-        cache.deserialize(open(cache_file, 'r').read())
+        cache.deserialize(open(cache_file, "r").read())

     # Authentication
-    authority = f'https://login.microsoftonline.com/{tenant_id}'
-    app = msal.PublicClientApplication(client_id, authority=authority, token_cache=cache)
+    authority = f"https://login.microsoftonline.com/{tenant_id}"
+    app = msal.PublicClientApplication(
+        client_id, authority=authority, token_cache=cache
+    )
     accounts = app.get_accounts()

     token_response = None

     # Try silent authentication first
     if accounts:
         token_response = app.acquire_token_silent(scopes, account=accounts[0])
-    else:

+    # If silent auth failed or no accounts, clear cache and do device flow
+    if not token_response or "access_token" not in token_response:
+        # Clear the cache to force fresh authentication
+        if os.path.exists(cache_file):
+            os.remove(cache_file)
+        cache = msal.SerializableTokenCache()  # Create new empty cache
+        app = msal.PublicClientApplication(
+            client_id, authority=authority, token_cache=cache
+        )

         flow = app.initiate_device_flow(scopes=scopes)
-        if 'user_code' not in flow:
+        if "user_code" not in flow:
             raise Exception("Failed to create device flow")

-        print(Panel(flow['message'], border_style="magenta", padding=2, title="MSAL Login Flow Link"))
+        print(
+            Panel(
+                flow["message"],
+                border_style="magenta",
+                padding=2,
+                title="MSAL Login Flow Link",
+            )
+        )

         token_response = app.acquire_token_by_device_flow(flow)

-    if 'access_token' not in token_response:
-        raise Exception("Failed to acquire token")
+    if token_response is None:
+        raise Exception("Token response is None - authentication failed")
+
+    if "access_token" not in token_response:
+        error_description = token_response.get("error_description", "Unknown error")
+        error_code = token_response.get("error", "unknown_error")
+        raise Exception(f"Failed to acquire token - {error_code}: {error_description}")

     # Save token cache
-    with open(cache_file, 'w') as f:
+    with open(cache_file, "w") as f:
         f.write(cache.serialize())

-    access_token = token_response['access_token']
-    headers = {'Authorization': f'Bearer {access_token}', 'Prefer': 'outlook.body-content-type="text",IdType="ImmutableId"'}
+    access_token = token_response["access_token"]
+    headers = {
+        "Authorization": f"Bearer {access_token}",
+        "Prefer": 'outlook.body-content-type="text",IdType="ImmutableId"',
+    }

     return access_token, headers
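A minimal call sketch for the device-flow helper above, for readers skimming the diff. The scopes are copied from the retry handler later in this compare (in the HTTP client changes); nothing else is assumed.

```python
# Sketch: acquiring a Graph token via the device flow shown above.
from src.services.microsoft_graph.auth import get_access_token

scopes = [
    "https://graph.microsoft.com/Calendars.Read",
    "https://graph.microsoft.com/Mail.ReadWrite",
]
access_token, headers = get_access_token(scopes)
# `headers` already carries Authorization plus the Prefer header for text bodies.
```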
@@ -3,9 +3,13 @@ Calendar operations for Microsoft Graph API.
 """

 import os
+import json
+import re
+import glob
 from datetime import datetime, timedelta
 from dateutil import parser

-from .client import fetch_with_aiohttp
+from .client import fetch_with_aiohttp, post_with_aiohttp, delete_with_aiohttp


 async def fetch_calendar_events(
@@ -40,7 +44,7 @@ async def fetch_calendar_events(
     calendar_url = (
         f"https://graph.microsoft.com/v1.0/me/calendarView?"
         f"startDateTime={start_date_str}&endDateTime={end_date_str}&"
-        f"$select=id,subject,organizer,start,end,location,isAllDay,showAs,sensitivity&$count=true"
+        f"$select=id,subject,organizer,start,end,location,isAllDay,showAs,sensitivity,iCalUId,lastModifiedDateTime&$count=true"
     )

     events = []
@@ -59,3 +63,408 @@ async def fetch_calendar_events(
     # Return events and total count
     total_count = response_data.get("@odata.count", len(events))
     return events, total_count
+
+
+def parse_ical_file(file_path):
+    """
+    Parse a single iCalendar file and extract event data.
+
+    Args:
+        file_path (str): Path to the .ics file
+
+    Returns:
+        dict: Event data or None if parsing fails
+    """
+    try:
+        with open(file_path, "r", encoding="utf-8") as f:
+            content = f.read()
+
+        event_data = {}
+        in_event = False
+
+        for line in content.split("\n"):
+            line = line.strip()
+
+            if line == "BEGIN:VEVENT":
+                in_event = True
+                continue
+            elif line == "END:VEVENT":
+                break
+            elif not in_event:
+                continue
+
+            if ":" in line:
+                key, value = line.split(":", 1)
+
+                # Handle special cases
+                if key == "UID":
+                    event_data["uid"] = value
+                elif key == "SUMMARY":
+                    event_data["subject"] = (
+                        value.replace("\\,", ",")
+                        .replace("\\;", ";")
+                        .replace("\\n", "\n")
+                    )
+                elif key.startswith("DTSTART"):
+                    event_data["start"] = _parse_ical_datetime(key, value)
+                elif key.startswith("DTEND"):
+                    event_data["end"] = _parse_ical_datetime(key, value)
+                elif key == "LOCATION":
+                    event_data["location"] = value.replace("\\,", ",").replace(
+                        "\\;", ";"
+                    )
+                elif key == "DESCRIPTION":
+                    event_data["description"] = (
+                        value.replace("\\,", ",")
+                        .replace("\\;", ";")
+                        .replace("\\n", "\n")
+                    )
+
+        # Get file modification time for tracking local changes
+        event_data["local_mtime"] = os.path.getmtime(file_path)
+        event_data["local_file"] = file_path
+
+        return event_data if "uid" in event_data else None
+
+    except Exception as e:
+        print(f"Error parsing {file_path}: {e}")
+        return None
+
+
+def _parse_ical_datetime(key, value):
+    """Parse iCalendar datetime format."""
+    try:
+        if "TZID=" in key:
+            # Extract timezone info if present
+            tz_part = (
+                key.split("TZID=")[1].split(":")[0]
+                if ":" in key
+                else key.split("TZID=")[1]
+            )
+            # For now, treat as naive datetime and let dateutil handle it
+            return parser.parse(value.replace("Z", ""))
+        elif value.endswith("Z"):
+            # UTC time
+            return parser.parse(value)
+        else:
+            # Naive datetime
+            return parser.parse(value.replace("Z", ""))
+    except Exception:
+        return None
+
+
+def get_local_calendar_events(vdir_path):
+    """
+    Get all local calendar events from vdir format.
+
+    Args:
+        vdir_path (str): Path to vdir calendar directory
+
+    Returns:
+        dict: Dictionary mapping UIDs to event data
+    """
+    local_events = {}
+
+    if not os.path.exists(vdir_path):
+        return local_events
+
+    ics_files = glob.glob(os.path.join(vdir_path, "*.ics"))
+
+    for file_path in ics_files:
+        event_data = parse_ical_file(file_path)
+        if event_data and "uid" in event_data:
+            local_events[event_data["uid"]] = event_data
+
+    return local_events
+
+
+async def create_calendar_event(headers, event_data):
+    """
+    Create a new calendar event on Microsoft Graph.
+
+    Args:
+        headers (dict): Authentication headers
+        event_data (dict): Event data from local file
+
+    Returns:
+        dict: Created event response or None if failed
+    """
+    try:
+        # Convert local event data to Microsoft Graph format
+        graph_event = {
+            "subject": event_data.get("subject", "Untitled Event"),
+            "start": {"dateTime": event_data["start"].isoformat(), "timeZone": "UTC"},
+            "end": {"dateTime": event_data["end"].isoformat(), "timeZone": "UTC"},
+        }
+
+        if event_data.get("location"):
+            graph_event["location"] = {"displayName": event_data["location"]}
+
+        if event_data.get("description"):
+            graph_event["body"] = {
+                "contentType": "text",
+                "content": event_data["description"],
+            }
+
+        # Create the event
+        create_url = "https://graph.microsoft.com/v1.0/me/events"
+        status = await post_with_aiohttp(create_url, headers, graph_event)
+
+        if status == 201:
+            return graph_event
+        else:
+            print(f"Failed to create event: HTTP {status}")
+            return None
+
+    except Exception as e:
+        print(f"Error creating event: {e}")
+        return None
+
+
+async def delete_calendar_event_by_uid(headers, ical_uid):
+    """
+    Delete a calendar event by its iCalUId.
+
+    Args:
+        headers (dict): Authentication headers
+        ical_uid (str): The iCalUId of the event to delete
+
+    Returns:
+        bool: True if deleted successfully, False otherwise
+    """
+    try:
+        # First, find the event by iCalUId
+        search_url = f"https://graph.microsoft.com/v1.0/me/events?$filter=iCalUId eq '{ical_uid}'"
+        response = await fetch_with_aiohttp(search_url, headers)
+
+        events = response.get("value", [])
+        if not events:
+            print(f"Event with UID {ical_uid} not found on server")
+            return False
+
+        # Delete the event using its Graph ID
+        event_id = events[0]["id"]
+        delete_url = f"https://graph.microsoft.com/v1.0/me/events/{event_id}"
+        status = await delete_with_aiohttp(delete_url, headers)
+
+        if status == 204:
+            print(f"Successfully deleted event with UID {ical_uid}")
+            return True
+        else:
+            print(f"Failed to delete event: HTTP {status}")
+            return False
+
+    except Exception as e:
+        print(f"Error deleting event: {e}")
+        return False
+
+
+def get_sync_timestamp_file(vdir_path):
+    """Get the path to the sync timestamp file."""
+    return os.path.join(vdir_path, ".sync_timestamp")
+
+
+def get_last_sync_time(vdir_path):
+    """
+    Get the timestamp of the last sync.
+
+    Args:
+        vdir_path (str): Path to vdir calendar directory
+
+    Returns:
+        float: Unix timestamp of last sync, or 0 if never synced
+    """
+    timestamp_file = get_sync_timestamp_file(vdir_path)
+    if os.path.exists(timestamp_file):
+        try:
+            with open(timestamp_file, "r") as f:
+                return float(f.read().strip())
+        except (ValueError, IOError):
+            return 0
+    return 0
+
+
+def update_sync_timestamp(vdir_path):
+    """
+    Update the sync timestamp to current time.
+
+    Args:
+        vdir_path (str): Path to vdir calendar directory
+    """
+    timestamp_file = get_sync_timestamp_file(vdir_path)
+    try:
+        with open(timestamp_file, "w") as f:
+            f.write(str(datetime.now().timestamp()))
+    except IOError as e:
+        print(f"Warning: Could not update sync timestamp: {e}")
+
+
+def detect_deleted_events(vdir_path):
+    """
+    Detect events that have been deleted from vdir since last sync.
+    Uses sync state and file modification times to determine deletions.
+
+    Args:
+        vdir_path (str): Path to vdir calendar directory
+
+    Returns:
+        list: List of UIDs that were deleted locally
+    """
+    if not os.path.exists(vdir_path):
+        return []
+
+    state_file = os.path.join(vdir_path, ".sync_state.json")
+    last_sync_time = get_last_sync_time(vdir_path)
+
+    # Load previous sync state
+    previous_state = {}
+    if os.path.exists(state_file):
+        try:
+            with open(state_file, "r") as f:
+                previous_state = json.load(f)
+        except Exception:
+            return []
+
+    if not previous_state:
+        return []  # No previous state to compare against
+
+    # Get current local events
+    current_local_events = get_local_calendar_events(vdir_path)
+
+    deleted_events = []
+
+    # Check each event from previous state
+    for uid in previous_state:
+        if uid not in current_local_events:
+            # Event is no longer in local files
+            # Check if the vdir has been modified since last sync
+            # This ensures we only delete events that were intentionally removed
+            vdir_mtime = os.path.getmtime(vdir_path)
+            if vdir_mtime > last_sync_time:
+                deleted_events.append(uid)
+
+    return deleted_events
+
+
+async def sync_local_calendar_changes(
+    headers, vdir_path, progress, task_id, dry_run=False
+):
+    """
+    Sync local calendar changes (new events and deletions) to Microsoft Graph.
+
+    Args:
+        headers (dict): Authentication headers
+        vdir_path (str): Path to local vdir calendar directory
+        progress: Progress instance for updates
+        task_id: Progress task ID
+        dry_run (bool): If True, only report what would be done
+
+    Returns:
+        tuple: (created_count, deleted_count)
+    """
+    if not os.path.exists(vdir_path):
+        progress.console.print(
+            f"[yellow]Local calendar directory not found: {vdir_path}[/yellow]"
+        )
+        return 0, 0
+
+    # Track state file for knowing what was previously synced
+    state_file = os.path.join(vdir_path, ".sync_state.json")
+
+    # Load previous sync state
+    previous_state = {}
+    if os.path.exists(state_file):
+        try:
+            with open(state_file, "r") as f:
+                previous_state = json.load(f)
+        except Exception as e:
+            progress.console.print(f"[yellow]Could not load sync state: {e}[/yellow]")
+
+    # Detect deleted events using enhanced detection
+    deleted_events = detect_deleted_events(vdir_path)
+
+    # Get current local events
+    current_local_events = get_local_calendar_events(vdir_path)
+
+    # Get current remote events to avoid duplicates
+    try:
+        remote_events, _ = await fetch_calendar_events(
+            headers, days_back=30, days_forward=90
+        )
+        remote_uids = {
+            event.get("iCalUId", event.get("id", "")) for event in remote_events
+        }
+    except Exception as e:
+        progress.console.print(f"[red]Error fetching remote events: {e}[/red]")
+        return 0, 0
+
+    created_count = 0
+    deleted_count = 0
+
+    # Find new local events (not in previous state and not on server)
+    new_local_events = []
+    for uid, event_data in current_local_events.items():
+        if uid not in previous_state and uid not in remote_uids:
+            # This is a new local event
+            new_local_events.append((uid, event_data))
+
+    progress.update(task_id, total=len(new_local_events) + len(deleted_events))
+
+    # Handle deletions FIRST to clean up server before adding new events
+    for uid in deleted_events:
+        if dry_run:
+            progress.console.print(f"[DRY-RUN] Would delete event with UID: {uid}")
+        else:
+            result = await delete_calendar_event_by_uid(headers, uid)
+            if result:
+                deleted_count += 1
+                progress.console.print(f"[green]Deleted event with UID: {uid}[/green]")
+            else:
+                progress.console.print(
+                    f"[red]Failed to delete event with UID: {uid}[/red]"
+                )
+
+        progress.advance(task_id)
+
+    # Create new events on server
+    for uid, event_data in new_local_events:
+        if dry_run:
+            progress.console.print(
+                f"[DRY-RUN] Would create event: {event_data.get('subject', 'Untitled')}"
+            )
+        else:
+            result = await create_calendar_event(headers, event_data)
+            if result:
+                created_count += 1
+                progress.console.print(
+                    f"[green]Created event: {event_data.get('subject', 'Untitled')}[/green]"
+                )
+            else:
+                progress.console.print(
+                    f"[red]Failed to create event: {event_data.get('subject', 'Untitled')}[/red]"
+                )
+
+        progress.advance(task_id)
+
+    # Update sync state and timestamp
+    if not dry_run:
+        new_state = {
+            uid: event_data.get("local_mtime", 0)
+            for uid, event_data in current_local_events.items()
+        }
+        try:
+            with open(state_file, "w") as f:
+                json.dump(new_state, f, indent=2)
+
+            # Update sync timestamp to mark when this sync completed
+            update_sync_timestamp(vdir_path)
+
+        except Exception as e:
+            progress.console.print(f"[yellow]Could not save sync state: {e}[/yellow]")
+
+    if created_count > 0 or deleted_count > 0:
+        progress.console.print(
+            f"[cyan]Local calendar sync completed: {created_count} created, {deleted_count} deleted[/cyan]"
+        )
+
+    return created_count, deleted_count
@@ -1,16 +1,75 @@
 """
 HTTP client for Microsoft Graph API.
 """

 import aiohttp
 import asyncio
+import logging
 import orjson

-# Define a global semaphore for throttling
-semaphore = asyncio.Semaphore(4)
+# Suppress debug logging from HTTP libraries
+logging.getLogger("aiohttp").setLevel(logging.ERROR)
+logging.getLogger("aiohttp.access").setLevel(logging.ERROR)
+logging.getLogger("urllib3").setLevel(logging.ERROR)
+logging.getLogger("asyncio").setLevel(logging.ERROR)
+
+# Define a global semaphore for throttling - reduced for better compliance
+semaphore = asyncio.Semaphore(2)
+
+
+async def _handle_throttling_retry(func, *args, max_retries=3):
+    """Handle 429 throttling and 401 authentication errors with exponential backoff retry."""
+    for attempt in range(max_retries):
+        try:
+            return await func(*args)
+        except Exception as e:
+            error_str = str(e)
+            if (
+                "429" in error_str
+                or "InvalidAuthenticationToken" in error_str
+                or "401" in error_str
+            ) and attempt < max_retries - 1:
+                wait_time = (2**attempt) + 1  # Exponential backoff: 2, 3, 5 seconds
+                if "429" in error_str:
+                    print(
+                        f"Rate limited, waiting {wait_time}s before retry {attempt + 1}/{max_retries}"
+                    )
+                elif "InvalidAuthenticationToken" in error_str or "401" in error_str:
+                    print(
+                        f"Authentication failed (token expired), refreshing token and retrying in {wait_time}s (attempt {attempt + 1}/{max_retries})"
+                    )
+                    # Force re-authentication by clearing cache and getting new token
+                    import os
+
+                    cache_file = "token_cache.bin"
+                    if os.path.exists(cache_file):
+                        os.remove(cache_file)
+                    # Re-import and call get_access_token to refresh
+                    from src.services.microsoft_graph.auth import get_access_token
+
+                    # We need to get the scopes from somewhere - for now assume standard scopes
+                    scopes = [
+                        "https://graph.microsoft.com/Calendars.Read",
+                        "https://graph.microsoft.com/Mail.ReadWrite",
+                    ]
+                    try:
+                        new_token, new_headers = get_access_token(scopes)
+                        # Update the headers in args - this is a bit hacky but should work
+                        if len(args) > 1 and isinstance(args[1], dict):
+                            args = list(args)
+                            args[1] = new_headers
+                            args = tuple(args)
+                    except Exception as auth_error:
+                        print(f"Failed to refresh token: {auth_error}")
+                        raise e  # Re-raise original error
+                await asyncio.sleep(wait_time)
+                continue
+            raise e


 async def fetch_with_aiohttp(url, headers):
     """
-    Fetch data from Microsoft Graph API.
+    Fetch data from Microsoft Graph API with throttling and retry logic.

     Args:
         url (str): The URL to fetch data from.
@@ -20,23 +79,37 @@ async def fetch_with_aiohttp(url, headers):
         dict: JSON response data.

     Raises:
-        Exception: If the request fails.
+        Exception: If the request fails after retries.
     """
+    return await _handle_throttling_retry(_fetch_impl, url, headers)
+
+
+async def _fetch_impl(url, headers):
+    """Internal fetch implementation."""
     async with semaphore:
         async with aiohttp.ClientSession() as session:
             async with session.get(url, headers=headers) as response:
-                if response.status != 200:
-                    raise Exception(f"Failed to fetch {url}: {response.status} {await response.text()}")
+                if response.status in [401, 429]:
+                    # Let the retry handler deal with authentication and throttling
+                    response_text = await response.text()
+                    raise Exception(
+                        f"Failed to fetch {url}: {response.status} {response_text}"
+                    )
+                elif response.status != 200:
+                    raise Exception(
+                        f"Failed to fetch {url}: {response.status} {await response.text()}"
+                    )
                 raw_bytes = await response.read()
-                content_length = response.headers.get('Content-Length')
+                content_length = response.headers.get("Content-Length")
                 if content_length and len(raw_bytes) != int(content_length):
                     print("Warning: Incomplete response received!")
                     return None
                 return orjson.loads(raw_bytes)


 async def post_with_aiohttp(url, headers, json_data):
     """
-    Post data to Microsoft Graph API.
+    Post data to Microsoft Graph API with throttling and retry logic.

     Args:
         url (str): The URL to post data to.
@@ -46,14 +119,25 @@ async def post_with_aiohttp(url, headers, json_data):
     Returns:
         int: HTTP status code.
     """
+    return await _handle_throttling_retry(_post_impl, url, headers, json_data)
+
+
+async def _post_impl(url, headers, json_data):
+    """Internal post implementation."""
     async with semaphore:
         async with aiohttp.ClientSession() as session:
             async with session.post(url, headers=headers, json=json_data) as response:
+                if response.status in [401, 429]:
+                    response_text = await response.text()
+                    raise Exception(
+                        f"Failed to post {url}: {response.status} {response_text}"
+                    )
                 return response.status


 async def patch_with_aiohttp(url, headers, json_data):
     """
-    Patch data to Microsoft Graph API.
+    Patch data to Microsoft Graph API with throttling and retry logic.

     Args:
         url (str): The URL to patch data to.
@@ -63,14 +147,25 @@ async def patch_with_aiohttp(url, headers, json_data):
     Returns:
         int: HTTP status code.
     """
+    return await _handle_throttling_retry(_patch_impl, url, headers, json_data)
+
+
+async def _patch_impl(url, headers, json_data):
+    """Internal patch implementation."""
     async with semaphore:
         async with aiohttp.ClientSession() as session:
             async with session.patch(url, headers=headers, json=json_data) as response:
+                if response.status in [401, 429]:
+                    response_text = await response.text()
+                    raise Exception(
+                        f"Failed to patch {url}: {response.status} {response_text}"
+                    )
                 return response.status


 async def delete_with_aiohttp(url, headers):
     """
-    Delete data from Microsoft Graph API.
+    Delete data from Microsoft Graph API with throttling and retry logic.

     Args:
         url (str): The URL to delete data from.
@@ -79,7 +174,53 @@ async def delete_with_aiohttp(url, headers):
     Returns:
         int: HTTP status code.
     """
+    return await _handle_throttling_retry(_delete_impl, url, headers)
+
+
+async def _delete_impl(url, headers):
+    """Internal delete implementation."""
     async with semaphore:
         async with aiohttp.ClientSession() as session:
             async with session.delete(url, headers=headers) as response:
+                if response.status in [401, 429]:
+                    response_text = await response.text()
+                    raise Exception(
+                        f"Failed to delete {url}: {response.status} {response_text}"
+                    )
                 return response.status
+
+
+async def batch_with_aiohttp(requests, headers):
+    """
+    Execute multiple requests in a single batch call to Microsoft Graph API with throttling and retry logic.
+
+    Args:
+        requests (list): List of request dictionaries with 'id', 'method', 'url', and optional 'body' keys.
+        headers (dict): Headers including authentication.
+
+    Returns:
+        dict: Batch response with individual request responses.
+    """
+    return await _handle_throttling_retry(_batch_impl, requests, headers)
+
+
+async def _batch_impl(requests, headers):
+    """Internal batch implementation."""
+    batch_url = "https://graph.microsoft.com/v1.0/$batch"
+    batch_data = {"requests": requests}
+
+    async with semaphore:
+        async with aiohttp.ClientSession() as session:
+            async with session.post(
+                batch_url, headers=headers, json=batch_data
+            ) as response:
+                if response.status in [401, 429]:
+                    response_text = await response.text()
+                    raise Exception(
+                        f"Batch request failed: {response.status} {response_text}"
+                    )
+                elif response.status != 200:
+                    raise Exception(
+                        f"Batch request failed: {response.status} {await response.text()}"
+                    )
+                return await response.json()
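The backoff in `_handle_throttling_retry` above is `(2 ** attempt) + 1`, so the three attempts wait 2s, 3s, and 5s. A standalone sketch of the same policy, stripped of the token-refresh branch, for readers who want the retry shape in isolation (illustrative, not part of the diff):

```python
# Sketch of the retry policy used above: waits of 2s, 3s, 5s on throttling.
import asyncio


async def with_backoff(coro_factory, max_retries=3):
    for attempt in range(max_retries):
        try:
            return await coro_factory()
        except Exception as e:
            if "429" in str(e) and attempt < max_retries - 1:
                await asyncio.sleep((2 ** attempt) + 1)  # 2, 3, 5 seconds
                continue
            raise
```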
@@ -5,14 +5,17 @@ Mail operations for Microsoft Graph API.
 import os
 import re
 import glob
-from typing import Set
 import aiohttp
+import asyncio
 from email.parser import Parser
 from email.utils import getaddresses
+from typing import List, Dict, Any

 from .client import (
     fetch_with_aiohttp,
     patch_with_aiohttp,
     post_with_aiohttp,
     delete_with_aiohttp,
+    batch_with_aiohttp,
 )
@@ -41,7 +44,6 @@ async def fetch_mail_async(
         None
     """
     from src.utils.mail_utils.maildir import save_mime_to_maildir_async
-    from src.utils.mail_utils.helpers import truncate_id

     mail_url = "https://graph.microsoft.com/v1.0/me/mailFolders/inbox/messages?$top=100&$orderby=receivedDateTime asc&$select=id,subject,from,toRecipients,ccRecipients,receivedDateTime,isRead"
     messages = []
@@ -73,9 +75,18 @@ async def fetch_mail_async(
     new_files = set(glob.glob(os.path.join(new_dir, "*.eml*")))
     cur_files = set(glob.glob(os.path.join(cur_dir, "*.eml*")))

-    for filename in Set.union(cur_files, new_files):
-        message_id = filename.split(".")[0].split("/")[
-            -1
-        ]  # Extract the Message-ID from the filename
+    # Get local message IDs (filename without extension)
+    local_msg_ids = set()
+    for filename in set.union(cur_files, new_files):
+        message_id = os.path.basename(filename).split(".")[
+            0
+        ]  # Extract the Message-ID from the filename
+        local_msg_ids.add(message_id)
+
+    # Delete local files that no longer exist on server
+    for filename in set.union(cur_files, new_files):
+        message_id = os.path.basename(filename).split(".")[
+            0
+        ]  # Extract the Message-ID from the filename
         if message_id not in inbox_msg_ids:
             if not dry_run:
@@ -84,7 +95,18 @@ async def fetch_mail_async(
             else:
                 progress.console.print(f"[DRY-RUN] Would delete {filename} from inbox")

-    for message in messages:
+    # Filter messages to only include those not already local
+    messages_to_download = [msg for msg in messages if msg["id"] not in local_msg_ids]
+
+    progress.console.print(
+        f"Found {len(messages)} total messages on server, {len(local_msg_ids)} already local"
+    )
+    progress.console.print(f"Downloading {len(messages_to_download)} new messages")
+
+    # Update progress to reflect only the messages we actually need to download
+    progress.update(task_id, total=len(messages_to_download), completed=0)
+
+    for message in messages_to_download:
         progress.console.print(
             f"Processing message: {message.get('subject', 'No Subject')}", end="\r"
         )
@@ -97,14 +119,19 @@ async def fetch_mail_async(
             dry_run,
             download_attachments,
         )
-        progress.update(task_id, advance=0.5)
-    progress.update(task_id, completed=len(messages))
-    progress.console.print(f"\nFinished saving {len(messages)} messages.")
+        progress.update(task_id, advance=1)
+    progress.update(task_id, completed=len(messages_to_download))
+    progress.console.print(
+        f"\nFinished downloading {len(messages_to_download)} new messages."
+    )
+    progress.console.print(
+        f"Total messages on server: {len(messages)}, Already local: {len(local_msg_ids)}"
+    )


 async def archive_mail_async(maildir_path, headers, progress, task_id, dry_run=False):
     """
-    Archive mail from Maildir to Microsoft Graph API archive folder.
+    Archive mail from Maildir to Microsoft Graph API archive folder using batch operations.

     Args:
         maildir_path (str): Path to the Maildir.
@@ -125,8 +152,14 @@ async def archive_mail_async(maildir_path, headers, progress, task_id, dry_run=F
         glob.glob(os.path.join(archive_dir, "**", "*.eml*"), recursive=True)
     )

+    if not archive_files:
+        progress.update(task_id, total=0, completed=0)
+        progress.console.print("No messages to archive")
+        return
+
     progress.update(task_id, total=len(archive_files))

+    # Get archive folder ID from server
     folder_response = await fetch_with_aiohttp(
         "https://graph.microsoft.com/v1.0/me/mailFolders", headers
     )
@@ -143,44 +176,115 @@ async def archive_mail_async(maildir_path, headers, progress, task_id, dry_run=F
|
||||
if not archive_folder_id:
|
||||
raise Exception("No folder named 'Archive' or 'Archives' found on the server.")
|
||||
|
||||
for filepath in archive_files:
|
||||
message_id = os.path.basename(filepath).split(".")[
|
||||
0
|
||||
] # Extract the Message-ID from the filename
|
||||
# Process files in batches of 20 (Microsoft Graph batch limit)
|
||||
batch_size = 20
|
||||
successful_moves = []
|
||||
|
||||
for i in range(0, len(archive_files), batch_size):
|
||||
batch_files = archive_files[i : i + batch_size]
|
||||
|
||||
# Add small delay between batches to respect API limits
|
||||
if i > 0:
|
||||
await asyncio.sleep(0.5)
|
||||
|
||||
if not dry_run:
|
||||
status = await post_with_aiohttp(
|
||||
f"https://graph.microsoft.com/v1.0/me/messages/{message_id}/microsoft.graph.move",
|
||||
headers,
|
||||
{"destinationId": archive_folder_id},
|
||||
)
|
||||
if status == 201: # 201 Created indicates successful move
|
||||
os.remove(
|
||||
filepath
|
||||
) # Remove the local file since it's now archived on server
|
||||
                        progress.console.print(f"Moved message to 'Archive': {message_id}")
                    elif status == 404:
                        os.remove(
                            filepath
                        )  # Remove the file from local archive if not found on server
                        progress.console.print(
                            f"Message not found on server, removed local copy: {message_id}"
                        )
                    else:
                        progress.console.print(
                            f"Failed to move message to 'Archive': {message_id}, status: {status}"
            # Prepare batch requests
            batch_requests = []
            for idx, filepath in enumerate(batch_files):
                message_id = os.path.basename(filepath).split(".")[0]
                batch_requests.append(
                    {
                        "id": str(idx + 1),
                        "method": "POST",
                        "url": f"/me/messages/{message_id}/microsoft.graph.move",
                        "body": {"destinationId": archive_folder_id},
                        "headers": {"Content-Type": "application/json"},
                    }
                )

            try:
                # Execute batch request
                batch_response = await batch_with_aiohttp(batch_requests, headers)

                # Process batch results
                for response in batch_response.get("responses", []):
                    request_id = (
                        int(response["id"]) - 1
                    )  # Convert back to 0-based index
                    filepath = batch_files[request_id]
                    message_id = os.path.basename(filepath).split(".")[0]
                    status = response["status"]

                    if status == 201:  # 201 Created indicates successful move
                        os.remove(
                            filepath
                        )  # Remove the local file since it's now archived on server
                        successful_moves.append(message_id)
                        progress.console.print(
                            f"Moved message to 'Archive': {message_id}"
                        )
                    elif status == 404:
                        os.remove(
                            filepath
                        )  # Remove the file from local archive if not found on server
                        progress.console.print(
                            f"Message not found on server, removed local copy: {message_id}"
                        )
                    else:
                        progress.console.print(
                            f"Failed to move message to 'Archive': {message_id}, status: {status}"
                        )

            except Exception as e:
                progress.console.print(f"Batch archive request failed: {str(e)}")
                # Fall back to individual requests for this batch
                for filepath in batch_files:
                    message_id = os.path.basename(filepath).split(".")[0]
                    try:
                        status = await post_with_aiohttp(
                            f"https://graph.microsoft.com/v1.0/me/messages/{message_id}/microsoft.graph.move",
                            headers,
                            {"destinationId": archive_folder_id},
                        )
                        if status == 201:
                            os.remove(filepath)
                            successful_moves.append(message_id)
                            progress.console.print(
                                f"Moved message to 'Archive' (fallback): {message_id}"
                            )
                        elif status == 404:
                            os.remove(filepath)
                            progress.console.print(
                                f"Message not found on server, removed local copy: {message_id}"
                            )
                        else:
                            progress.console.print(
                                f"Failed to move message to 'Archive': {message_id}, status: {status}"
                            )
                    except Exception as individual_error:
                        progress.console.print(
                            f"Failed to archive {message_id}: {str(individual_error)}"
                        )
        else:
            progress.console.print(
                f"[DRY-RUN] Would move message to 'Archive' folder: {message_id}"
            )
            progress.advance(task_id)
            # Dry run - just log what would be done
            for filepath in batch_files:
                message_id = os.path.basename(filepath).split(".")[0]
                progress.console.print(
                    f"[DRY-RUN] Would move message to 'Archive' folder: {message_id}"
                )

        progress.advance(task_id, len(batch_files))

    if not dry_run:
        progress.console.print(
            f"Successfully archived {len(successful_moves)} messages in batches"
        )
    return
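`batch_with_aiohttp` itself is outside this hunk. A minimal sketch of such a helper, assuming the documented Microsoft Graph JSON batching endpoint and that `headers` already carries the bearer token (the function name simply mirrors the call sites above):

```python
import aiohttp

GRAPH_BATCH_URL = "https://graph.microsoft.com/v1.0/$batch"

async def batch_with_aiohttp(batch_requests, headers):
    """POST up to 20 sub-requests to Graph's $batch endpoint in one round trip."""
    async with aiohttp.ClientSession() as session:
        async with session.post(
            GRAPH_BATCH_URL, headers=headers, json={"requests": batch_requests}
        ) as resp:
            resp.raise_for_status()
            # Graph answers with {"responses": [{"id": ..., "status": ...}, ...]},
            # which the loops above unpack by sub-request id.
            return await resp.json()
```

Graph caps a single batch at 20 sub-requests, which is why the code above slices work into `batch_size = 20` groups.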


async def delete_mail_async(maildir_path, headers, progress, task_id, dry_run=False):
    """
    Delete mail from Maildir and Microsoft Graph API.
    Delete mail from Maildir and Microsoft Graph API using batch operations.

    Args:
        maildir_path (str): Path to the Maildir.
@@ -194,22 +298,99 @@ async def delete_mail_async(maildir_path, headers, progress, task_id, dry_run=Fa
    """
    trash_dir = os.path.join(maildir_path, ".Trash", "cur")
    trash_files = set(glob.glob(os.path.join(trash_dir, "*.eml*")))

    if not trash_files:
        progress.update(task_id, total=0, completed=0)
        progress.console.print("No messages to delete")
        return

    progress.update(task_id, total=len(trash_files))

    for filepath in trash_files:
        message_id = os.path.basename(filepath).split(".")[
            0
        ]  # Extract the Message-ID from the filename
    # Process files in batches of 20 (Microsoft Graph batch limit)
    batch_size = 20
    trash_files_list = list(trash_files)
    successful_deletes = []

    for i in range(0, len(trash_files_list), batch_size):
        batch_files = trash_files_list[i : i + batch_size]

        # Add small delay between batches to respect API limits
        if i > 0:
            await asyncio.sleep(0.5)

        if not dry_run:
            progress.console.print(f"Moving message to trash: {message_id}")
            status = await delete_with_aiohttp(
                f"https://graph.microsoft.com/v1.0/me/messages/{message_id}", headers
            )
            if status == 204 or status == 404:
                os.remove(filepath)  # Remove the file from local trash
            # Prepare batch requests
            batch_requests = []
            for idx, filepath in enumerate(batch_files):
                message_id = os.path.basename(filepath).split(".")[0]
                batch_requests.append(
                    {
                        "id": str(idx + 1),
                        "method": "DELETE",
                        "url": f"/me/messages/{message_id}",
                    }
                )

            try:
                # Execute batch request
                batch_response = await batch_with_aiohttp(batch_requests, headers)

                # Process batch results
                for response in batch_response.get("responses", []):
                    request_id = (
                        int(response["id"]) - 1
                    )  # Convert back to 0-based index
                    filepath = batch_files[request_id]
                    message_id = os.path.basename(filepath).split(".")[0]
                    status = response["status"]

                    if (
                        status == 204 or status == 404
                    ):  # 204 No Content or 404 Not Found (already deleted)
                        os.remove(filepath)  # Remove the file from local trash
                        successful_deletes.append(message_id)
                        progress.console.print(f"Deleted message: {message_id}")
                    else:
                        progress.console.print(
                            f"Failed to delete message: {message_id}, status: {status}"
                        )

            except Exception as e:
                progress.console.print(f"Batch delete request failed: {str(e)}")
                # Fall back to individual requests for this batch
                for filepath in batch_files:
                    message_id = os.path.basename(filepath).split(".")[0]
                    try:
                        status = await delete_with_aiohttp(
                            f"https://graph.microsoft.com/v1.0/me/messages/{message_id}",
                            headers,
                        )
                        if status == 204 or status == 404:
                            os.remove(filepath)
                            successful_deletes.append(message_id)
                            progress.console.print(
                                f"Deleted message (fallback): {message_id}"
                            )
                        else:
                            progress.console.print(
                                f"Failed to delete message: {message_id}, status: {status}"
                            )
                    except Exception as individual_error:
                        progress.console.print(
                            f"Failed to delete {message_id}: {str(individual_error)}"
                        )
        else:
            progress.console.print(f"[DRY-RUN] Would delete message: {message_id}")
            progress.advance(task_id)
            # Dry run - just log what would be done
            for filepath in batch_files:
                message_id = os.path.basename(filepath).split(".")[0]
                progress.console.print(f"[DRY-RUN] Would delete message: {message_id}")

        progress.advance(task_id, len(batch_files))

    if not dry_run:
        progress.console.print(
            f"Successfully deleted {len(successful_deletes)} messages in batches"
        )
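Both branches above treat 404 like 204: if the server no longer has the message, the delete effectively already happened, so removing the local trash copy is safe either way. That check could be factored out (the helper name is illustrative, not part of this change):

```python
def delete_succeeded(status: int) -> bool:
    # 204 No Content: deleted on this run; 404 Not Found: already gone.
    # Either way the local trash copy can be removed.
    return status in (204, 404)
```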


async def get_inbox_count_async(headers):
@@ -231,7 +412,7 @@ async def synchronize_maildir_async(
    maildir_path, headers, progress, task_id, dry_run=False
):
    """
    Synchronize Maildir with Microsoft Graph API.
    Synchronize Maildir with Microsoft Graph API using batch operations.

    Args:
        maildir_path (str): Path to the Maildir.
@@ -258,32 +439,396 @@ async def synchronize_maildir_async(
    cur_files = set(glob.glob(os.path.join(cur_dir, "*.eml*")))

    moved_to_cur = [os.path.basename(f) for f in cur_files - new_files]
    progress.update(task_id, total=len(moved_to_cur))
    for filename in moved_to_cur:
        # TODO: this isn't scalable, we should use a more efficient way to check if the file was modified
        if os.path.getmtime(os.path.join(cur_dir, filename)) < last_sync:
            progress.update(task_id, advance=1)
            continue
        message_id = re.sub(
            r"\:2.+", "", filename.split(".")[0]
        )  # Extract the Message-ID from the filename
        if not dry_run:
            status = await patch_with_aiohttp(
                f"https://graph.microsoft.com/v1.0/me/messages/{message_id}",
                headers,
                {"isRead": True},
            )
            if status == 404:
                os.remove(os.path.join(cur_dir, filename))

    # Filter out files that haven't been modified since last sync
    files_to_process = []
    for filename in moved_to_cur:
        if os.path.getmtime(os.path.join(cur_dir, filename)) >= last_sync:
            files_to_process.append(filename)

    if not files_to_process:
        progress.update(task_id, total=0, completed=0)
        progress.console.print("No messages to mark as read")
        # Save timestamp even if no work was done
        if not dry_run:
            save_sync_timestamp()
        return

    progress.update(task_id, total=len(files_to_process))

    # Process files in batches of 20 (Microsoft Graph batch limit)
    batch_size = 20
    successful_reads = []

    for i in range(0, len(files_to_process), batch_size):
        batch_files = files_to_process[i : i + batch_size]

        # Add small delay between batches to respect API limits
        if i > 0:
            await asyncio.sleep(0.5)

        if not dry_run:
            # Prepare batch requests
            batch_requests = []
            for idx, filename in enumerate(batch_files):
                message_id = re.sub(r"\:2.+", "", filename.split(".")[0])
                batch_requests.append(
                    {
                        "id": str(idx + 1),
                        "method": "PATCH",
                        "url": f"/me/messages/{message_id}",
                        "body": {"isRead": True},
                        "headers": {"Content-Type": "application/json"},
                    }
                )

            try:
                # Execute batch request
                batch_response = await batch_with_aiohttp(batch_requests, headers)

                # Process batch results
                for response in batch_response.get("responses", []):
                    request_id = (
                        int(response["id"]) - 1
                    )  # Convert back to 0-based index
                    filename = batch_files[request_id]
                    message_id = re.sub(r"\:2.+", "", filename.split(".")[0])
                    status = response["status"]

                    if status == 200:  # 200 OK indicates successful update
                        successful_reads.append(message_id)
                        progress.console.print(
                            f"Marked message as read: {truncate_id(message_id)}"
                        )
                    elif status == 404:
                        os.remove(
                            os.path.join(cur_dir, filename)
                        )  # Remove file if message doesn't exist on server
                        progress.console.print(
                            f"Message not found on server, removed local copy: {truncate_id(message_id)}"
                        )
                    else:
                        progress.console.print(
                            f"Failed to mark message as read: {truncate_id(message_id)}, status: {status}"
                        )

            except Exception as e:
                progress.console.print(f"Batch read-status request failed: {str(e)}")
                # Fall back to individual requests for this batch
                for filename in batch_files:
                    message_id = re.sub(r"\:2.+", "", filename.split(".")[0])
                    try:
                        status = await patch_with_aiohttp(
                            f"https://graph.microsoft.com/v1.0/me/messages/{message_id}",
                            headers,
                            {"isRead": True},
                        )
                        if status == 200:
                            successful_reads.append(message_id)
                            progress.console.print(
                                f"Marked message as read (fallback): {truncate_id(message_id)}"
                            )
                        elif status == 404:
                            os.remove(os.path.join(cur_dir, filename))
                            progress.console.print(
                                f"Message not found on server, removed local copy: {truncate_id(message_id)}"
                            )
                        else:
                            progress.console.print(
                                f"Failed to mark message as read: {truncate_id(message_id)}, status: {status}"
                            )
                    except Exception as individual_error:
                        progress.console.print(
                            f"Failed to update read status for {truncate_id(message_id)}: {str(individual_error)}"
                        )
        else:
            progress.console.print(
                f"[DRY-RUN] Would mark message as read: {truncate_id(message_id)}"
            )
            progress.advance(task_id)
            # Dry run - just log what would be done
            for filename in batch_files:
                message_id = re.sub(r"\:2.+", "", filename.split(".")[0])
                progress.console.print(
                    f"[DRY-RUN] Would mark message as read: {truncate_id(message_id)}"
                )

        progress.advance(task_id, len(batch_files))

    # Save the current sync timestamp
    if not dry_run:
        save_sync_timestamp()
        progress.console.print(
            f"Successfully marked {len(successful_reads)} messages as read in batches"
        )
    else:
        progress.console.print("[DRY-RUN] Would save sync timestamp.")
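The `re.sub(r"\:2.+", ...)` calls above strip the Maildir info suffix that mail clients append when flagging a message (`:2,` plus flag letters such as `S` for seen). A small illustration with hypothetical filenames:

```python
import re

def extract_message_id(filename: str) -> str:
    # Drop the extension first, then any surviving ":2,<flags>" suffix.
    return re.sub(r"\:2.+", "", filename.split(".")[0])

print(extract_message_id("AAMkADExampleId.eml:2,S"))  # -> AAMkADExampleId
print(extract_message_id("AAMkADExampleId:2,S"))      # -> AAMkADExampleId
```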


def parse_email_for_graph_api(email_content: str) -> Dict[str, Any]:
    """
    Parse email content and convert to Microsoft Graph API message format.

    Args:
        email_content: Raw email content (RFC 5322 format)

    Returns:
        Dictionary formatted for Microsoft Graph API send message
    """
    parser = Parser()
    msg = parser.parsestr(email_content)

    # Parse recipients
    def parse_recipients(header_value: str) -> List[Dict[str, Any]]:
        if not header_value:
            return []
        addresses = getaddresses([header_value])
        return [
            {"emailAddress": {"address": addr, "name": name if name else addr}}
            for name, addr in addresses
            if addr
        ]

    to_recipients = parse_recipients(msg.get("To", ""))
    cc_recipients = parse_recipients(msg.get("Cc", ""))
    bcc_recipients = parse_recipients(msg.get("Bcc", ""))

    # Get body content
    body_content = ""
    body_type = "text"

    if msg.is_multipart():
        for part in msg.walk():
            if part.get_content_type() == "text/plain":
                body_content = part.get_payload(decode=True).decode(
                    "utf-8", errors="ignore"
                )
                body_type = "text"
                break
            elif part.get_content_type() == "text/html":
                body_content = part.get_payload(decode=True).decode(
                    "utf-8", errors="ignore"
                )
                body_type = "html"
    else:
        body_content = msg.get_payload(decode=True).decode("utf-8", errors="ignore")
        if msg.get_content_type() == "text/html":
            body_type = "html"

    # Build Graph API message
    message = {
        "subject": msg.get("Subject", ""),
        "body": {"contentType": body_type, "content": body_content},
        "toRecipients": to_recipients,
        "ccRecipients": cc_recipients,
        "bccRecipients": bcc_recipients,
    }

    # Add reply-to if present
    reply_to = msg.get("Reply-To", "")
    if reply_to:
        message["replyTo"] = parse_recipients(reply_to)

    return message
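For a quick sense of the mapping, a plain-text message parses roughly like this (addresses invented):

```python
raw = (
    "From: Me <me@example.com>\r\n"
    "To: You <you@example.com>\r\n"
    "Subject: Status update\r\n"
    "\r\n"
    "Short plain-text body.\r\n"
)
msg = parse_email_for_graph_api(raw)
assert msg["subject"] == "Status update"
assert msg["toRecipients"][0]["emailAddress"]["address"] == "you@example.com"
assert msg["body"]["contentType"] == "text"
```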


async def send_email_async(
    email_content: str, headers: Dict[str, str], dry_run: bool = False
) -> bool:
    """
    Send email using Microsoft Graph API.

    Args:
        email_content: Raw email content (RFC 5322 format)
        headers: Authentication headers for Microsoft Graph API
        dry_run: If True, don't actually send the email

    Returns:
        True if email was sent successfully, False otherwise
    """
    try:
        # Parse email content for Graph API
        message_data = parse_email_for_graph_api(email_content)

        if dry_run:
            print(f"[DRY-RUN] Would send email: {message_data['subject']}")
            print(
                f"[DRY-RUN] To: {[r['emailAddress']['address'] for r in message_data['toRecipients']]}"
            )
            return True

        # Send email via Graph API
        send_url = "https://graph.microsoft.com/v1.0/me/sendMail"

        # Log attempt
        import logging

        logging.basicConfig(
            level=logging.INFO,
            format="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
            handlers=[
                logging.FileHandler(
                    os.path.expanduser("~/Mail/sendmail.log"), mode="a"
                ),
            ],
        )
        logging.info(
            f"Attempting to send email: {message_data['subject']} to {[r['emailAddress']['address'] for r in message_data['toRecipients']]}"
        )

        response = await post_with_aiohttp(send_url, headers, {"message": message_data})

        # Microsoft Graph sendMail returns 202 Accepted on success
        if response == 202:
            logging.info(f"Successfully sent email: {message_data['subject']}")
            return True
        else:
            logging.error(
                f"Unexpected response code {response} when sending email: {message_data['subject']}"
            )
            return False

    except Exception as e:
        import logging

        logging.basicConfig(
            level=logging.INFO,
            format="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
            handlers=[
                logging.FileHandler(
                    os.path.expanduser("~/Mail/sendmail.log"), mode="a"
                ),
            ],
        )
        logging.error(f"Exception sending email: {e}", exc_info=True)
        print(f"Error sending email: {e}")
        return False
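Note that `logging.basicConfig` only takes effect the first time the root logger is configured, so the duplicated blocks above are mostly redundant after the first call. One possible consolidation, sketched here as a suggestion (the helper name is hypothetical, not part of this change):

```python
import logging
import os

def _sendmail_logger() -> logging.Logger:
    """Return a logger writing to ~/Mail/sendmail.log, configured exactly once."""
    logger = logging.getLogger("sendmail")
    if not logger.handlers:
        handler = logging.FileHandler(os.path.expanduser("~/Mail/sendmail.log"), mode="a")
        handler.setFormatter(
            logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s")
        )
        logger.addHandler(handler)
        logger.setLevel(logging.INFO)
    return logger
```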


async def process_outbox_async(
    maildir_path: str,
    org: str,
    headers: Dict[str, str],
    progress,
    task_id,
    dry_run: bool = False,
) -> tuple[int, int]:
    """
    Process outbound emails in the outbox queue.

    Args:
        maildir_path: Base maildir path
        org: Organization name
        headers: Authentication headers for Microsoft Graph API
        progress: Progress instance for updating progress bars
        task_id: ID of the task in the progress bar
        dry_run: If True, don't actually send emails

    Returns:
        Tuple of (successful_sends, failed_sends)
    """
    outbox_path = os.path.join(maildir_path, org, "outbox")
    new_dir = os.path.join(outbox_path, "new")
    cur_dir = os.path.join(outbox_path, "cur")
    failed_dir = os.path.join(outbox_path, "failed")

    # Ensure directories exist
    from src.utils.mail_utils.helpers import ensure_directory_exists

    ensure_directory_exists(failed_dir)

    # Get pending emails
    pending_emails = []
    if os.path.exists(new_dir):
        pending_emails = [f for f in os.listdir(new_dir) if not f.startswith(".")]

    if not pending_emails:
        progress.update(task_id, total=0, completed=0)
        return 0, 0

    progress.update(task_id, total=len(pending_emails))
    progress.console.print(
        f"Processing {len(pending_emails)} outbound emails for {org}"
    )

    successful_sends = 0
    failed_sends = 0

    for email_file in pending_emails:
        email_path = os.path.join(new_dir, email_file)

        try:
            # Read email content
            with open(email_path, "r", encoding="utf-8") as f:
                email_content = f.read()

            # Send email
            if await send_email_async(email_content, headers, dry_run):
                # Move to cur directory on success
                if not dry_run:
                    cur_path = os.path.join(cur_dir, email_file)
                    os.rename(email_path, cur_path)
                    progress.console.print(f"✓ Sent email: {email_file}")
                else:
                    progress.console.print(f"[DRY-RUN] Would send email: {email_file}")
                successful_sends += 1
            else:
                # Move to failed directory on failure
                if not dry_run:
                    failed_path = os.path.join(failed_dir, email_file)
                    os.rename(email_path, failed_path)
                    progress.console.print(f"✗ Failed to send email: {email_file}")

                    # Log the failure
                    import logging

                    logging.error(f"Failed to send email: {email_file}")

                    # Send notification about failure
                    from src.utils.notifications import send_notification

                    parser = Parser()
                    msg = parser.parsestr(email_content)
                    subject = msg.get("Subject", "Unknown")
                    send_notification(
                        title="Email Send Failed",
                        message=f"Failed to send: {subject}",
                        subtitle=f"Check {failed_dir}",
                        sound="default",
                    )
                failed_sends += 1

        except Exception as e:
            progress.console.print(f"✗ Error processing {email_file}: {e}")
            if not dry_run:
                # Move to failed directory
                failed_path = os.path.join(failed_dir, email_file)
                try:
                    os.rename(email_path, failed_path)
                except (OSError, FileNotFoundError):
                    pass  # File might already be moved or deleted
            failed_sends += 1

        progress.advance(task_id, 1)

    if not dry_run and successful_sends > 0:
        progress.console.print(f"✓ Successfully sent {successful_sends} emails")

        # Send success notification
        from src.utils.notifications import send_notification

        if successful_sends == 1:
            send_notification(
                title="Email Sent",
                message="1 email sent successfully",
                subtitle=f"from {org}",
                sound="default",
            )
        else:
            send_notification(
                title="Emails Sent",
                message=f"{successful_sends} emails sent successfully",
                subtitle=f"from {org}",
                sound="default",
            )

    if failed_sends > 0:
        progress.console.print(f"✗ Failed to send {failed_sends} emails")

    return successful_sends, failed_sends

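A minimal driver for the function above, assuming a rich `Progress` instance like the rest of the sync code uses (the path, org name, and token are made up):

```python
import asyncio
from rich.progress import Progress

async def main():
    headers = {"Authorization": "Bearer <access-token>"}  # hypothetical token
    with Progress() as progress:
        task_id = progress.add_task("Sending outbox", total=0)
        sent, failed = await process_outbox_async(
            "/home/me/Mail", "acme", headers, progress, task_id, dry_run=True
        )
    print(f"sent={sent} failed={failed}")

asyncio.run(main())
```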
8
src/services/ticktick/__init__.py
Normal file
@@ -0,0 +1,8 @@
"""
TickTick API service module.
"""

from .client import TickTickService
from .auth import get_ticktick_client

__all__ = ["TickTickService", "get_ticktick_client"]
354
src/services/ticktick/auth.py
Normal file
@@ -0,0 +1,354 @@
"""
Authentication module for TickTick API.
"""

import os
import json
import logging
import ssl
import certifi
from pathlib import Path
from typing import Optional, Dict, Any
from ticktick.oauth2 import OAuth2
from ticktick.api import TickTickClient

# Suppress verbose logging from TickTick library
logging.getLogger("ticktick").setLevel(logging.ERROR)
logging.getLogger("requests").setLevel(logging.ERROR)
logging.getLogger("urllib3").setLevel(logging.ERROR)

# Project name for token storage
PROJECT_NAME = "gtd-terminal-tools"


def get_token_directory() -> Path:
    """Get the directory where TickTick tokens are stored."""
    token_dir = Path.home() / ".local" / "share" / PROJECT_NAME
    token_dir.mkdir(parents=True, exist_ok=True)
    return token_dir


def get_token_file_path() -> Path:
    """Get the full path to the TickTick token file."""
    return get_token_directory() / "ticktick_tokens.json"


def load_ticktick_credentials() -> Dict[str, str]:
    """
    Load TickTick OAuth credentials from environment variables.

    Returns:
        Dict with client_id, client_secret, and redirect_uri

    Raises:
        ValueError: If required environment variables are missing
    """
    client_id = os.getenv("TICKTICK_CLIENT_ID")
    client_secret = os.getenv("TICKTICK_CLIENT_SECRET")
    redirect_uri = os.getenv("TICKTICK_REDIRECT_URI", "http://localhost:8080")

    if not client_id or not client_secret:
        raise ValueError(
            "Please set TICKTICK_CLIENT_ID and TICKTICK_CLIENT_SECRET environment variables.\n"
            "Register your app at: https://developer.ticktick.com/docs#/openapi"
        )

    return {
        "client_id": client_id,
        "client_secret": client_secret,
        "redirect_uri": redirect_uri,
    }


def load_stored_tokens() -> Optional[Dict[str, Any]]:
    """
    Load stored OAuth tokens from the token file.

    Returns:
        Token data dict if file exists and is valid, None otherwise
    """
    token_file = get_token_file_path()
    if not token_file.exists():
        return None

    try:
        with open(token_file, "r") as f:
            tokens = json.load(f)
        return tokens
    except (json.JSONDecodeError, OSError) as e:
        logging.warning(f"Failed to load token file {token_file}: {e}")
        return None


def check_token_validity() -> Dict[str, Any]:
    """
    Check the validity of stored OAuth tokens.

    Returns:
        Dict with 'valid', 'expires_at', 'expires_in_hours' keys
    """
    tokens = load_stored_tokens()
    if not tokens:
        return {"valid": False, "reason": "No tokens found"}

    # Check if we have required token fields (ticktick-py format)
    if not tokens.get("access_token"):
        return {"valid": False, "reason": "Missing access token"}

    # Check expiration using ticktick-py's expire_time field
    if "expire_time" in tokens:
        import datetime, time

        try:
            # expire_time is a Unix timestamp
            expires_at = datetime.datetime.fromtimestamp(tokens["expire_time"])
            now = datetime.datetime.now()

            # ticktick-py considers token expired if less than 60 seconds remain
            time_left = (expires_at - now).total_seconds()
            if time_left < 60:
                return {
                    "valid": False,
                    "reason": "Token expired or expiring soon",
                    "expires_at": expires_at.isoformat(),
                }
            else:
                hours_left = time_left / 3600
                return {
                    "valid": True,
                    "expires_at": expires_at.isoformat(),
                    "expires_in_hours": round(hours_left, 1),
                }
        except (ValueError, TypeError) as e:
            return {"valid": False, "reason": f"Invalid expiration format: {e}"}

    # Check readable_expire_time if available
    if "readable_expire_time" in tokens:
        return {
            "valid": True,
            "reason": f"Token found (expires: {tokens['readable_expire_time']})",
        }

    # If no expiration info, assume valid but warn
    return {"valid": True, "reason": "Token found (no expiration info)"}
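For reference, the fields read above come from ticktick-py's token cache. A representative file (all values invented; only `access_token`, `expire_time`, and `readable_expire_time` are actually consulted) would look like this as a Python dict:

```python
# ~/.local/share/gtd-terminal-tools/ticktick_tokens.json, shown as a dict:
tokens = {
    "access_token": "eyJhbGciOi...",               # OAuth2 bearer token
    "expire_time": 1767225600,                     # Unix timestamp checked above
    "readable_expire_time": "2025-12-31 16:00:00", # human-readable fallback
}
```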


def save_tokens(tokens: Dict[str, Any]) -> None:
    """
    Save OAuth tokens to the token file.

    Args:
        tokens: Token data to save
    """
    token_file = get_token_file_path()
    try:
        with open(token_file, "w") as f:
            json.dump(tokens, f, indent=2)
    except OSError as e:
        logging.error(f"Failed to save tokens to {token_file}: {e}")


def create_oauth_client(use_custom_cache: bool = True) -> OAuth2:
    """
    Create a TickTick OAuth2 client with custom token cache location.

    Args:
        use_custom_cache: Whether to use custom cache path in ~/.local/share

    Returns:
        OAuth2 client instance
    """
    credentials = load_ticktick_credentials()

    cache_path = str(get_token_file_path()) if use_custom_cache else ".token-oauth"

    # Check if SSL verification should be disabled (for corporate MITM proxies)
    disable_ssl = os.getenv("TICKTICK_DISABLE_SSL_VERIFY", "").lower() in (
        "true",
        "1",
        "yes",
    )

    # Create a session with SSL handling
    import requests
    from requests.adapters import HTTPAdapter
    from urllib3.util.retry import Retry

    session = requests.Session()

    if disable_ssl:
        # Disable SSL verification for corporate MITM environments
        session.verify = False
        # Suppress SSL warnings
        import urllib3

        urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
        logging.info(
            "SSL verification disabled for TickTick API (corporate proxy detected)"
        )
    else:
        # Use proper SSL certificate verification
        session.verify = certifi.where()
        os.environ["SSL_CERT_FILE"] = certifi.where()
        os.environ["REQUESTS_CA_BUNDLE"] = certifi.where()

    # Add retry strategy
    retry_strategy = Retry(
        total=3,
        backoff_factor=1,
        status_forcelist=[429, 500, 502, 503, 504],
    )
    adapter = HTTPAdapter(max_retries=retry_strategy)
    session.mount("http://", adapter)
    session.mount("https://", adapter)

    oauth_client = OAuth2(
        client_id=credentials["client_id"],
        client_secret=credentials["client_secret"],
        redirect_uri=credentials["redirect_uri"],
        cache_path=cache_path,
        scope="tasks:write tasks:read",
        session=session,
    )

    return oauth_client


def get_ticktick_client(
    username: Optional[str] = None, password: Optional[str] = None
) -> TickTickClient:
    """
    Get an authenticated TickTick client.

    Note: The ticktick-py library requires both OAuth2 credentials AND
    username/password for initial session setup. This is how the library works.

    Args:
        username: TickTick username (will prompt if not provided)
        password: TickTick password (will prompt if not provided)

    Returns:
        Authenticated TickTickClient instance

    Raises:
        ValueError: If OAuth credentials are invalid
        RuntimeError: If authentication fails
    """
    # First check OAuth credentials
    try:
        oauth_client = create_oauth_client()
    except ValueError as e:
        raise ValueError(f"OAuth setup failed: {str(e)}")

    # Get username/password
    if not username:
        username = os.getenv("TICKTICK_USERNAME")
        if not username:
            print("\n" + "=" * 50)
            print("TickTick Authentication Required")
            print("=" * 50)
            print("The TickTick library requires your login credentials")
            print("in addition to OAuth2 for initial session setup.")
            print("Your credentials are used only for authentication")
            print("and are not stored permanently.")
            print("=" * 50 + "\n")
            username = input("TickTick Username/Email: ")

    if not password:
        password = os.getenv("TICKTICK_PASSWORD")
        if not password:
            import getpass

            password = getpass.getpass("TickTick Password: ")

    # Debug OAuth token status before attempting login
    logging.debug(f"OAuth client cache path: {oauth_client.cache_path}")
    if hasattr(oauth_client, "access_token_info") and oauth_client.access_token_info:
        logging.debug("OAuth token is available and cached")
    else:
        logging.debug("OAuth token may need to be retrieved")

    try:
        # Enable more detailed logging for the API call
        logging.getLogger("ticktick").setLevel(logging.DEBUG)
        logging.getLogger("requests").setLevel(logging.DEBUG)

        logging.info(
            f"Attempting to create TickTick client with username: {username[:3]}***"
        )
        client = TickTickClient(username, password, oauth_client)

        # Restore logging levels
        logging.getLogger("ticktick").setLevel(logging.ERROR)
        logging.getLogger("requests").setLevel(logging.ERROR)

        # Test the client by making a simple API call
        try:
            # Try to get user info or projects to verify the client works
            projects = client.get_by_fields(search="projects")
            logging.info("TickTick client initialized and tested successfully")
        except Exception as test_e:
            logging.warning(f"Client created but API test failed: {test_e}")
            # Don't fail here, just log the warning

        return client
    except Exception as e:
        # Restore logging levels in case of error
        logging.getLogger("ticktick").setLevel(logging.ERROR)
        logging.getLogger("requests").setLevel(logging.ERROR)

        error_msg = str(e)
        logging.error(f"TickTick client initialization failed: {error_msg}")

        # Provide more detailed error messages
        if "login" in error_msg.lower():
            raise RuntimeError(
                f"Login failed: {error_msg}\n\n"
                "Please check:\n"
                "1. Your TickTick username/email and password are correct\n"
                "2. Your account isn't locked or requires 2FA\n"
                "3. You can log in successfully at https://ticktick.com"
            )
        elif "oauth" in error_msg.lower() or "token" in error_msg.lower():
            raise RuntimeError(
                f"OAuth authentication failed: {error_msg}\n\n"
                "Please check:\n"
                "1. Your OAuth2 credentials (TICKTICK_CLIENT_ID, TICKTICK_CLIENT_SECRET) are correct\n"
                "2. Your app is properly registered at https://developer.ticktick.com/docs#/openapi\n"
                "3. The redirect URI is set to: http://localhost:8080\n"
                "4. Try clearing the token cache: rm ~/.local/share/gtd-terminal-tools/ticktick_tokens.json"
            )
        elif "network" in error_msg.lower() or "connection" in error_msg.lower():
            raise RuntimeError(
                f"Network connection failed: {error_msg}\n\n"
                "Please check:\n"
                "1. Your internet connection\n"
                "2. If you're behind a corporate firewall, SSL verification is disabled\n"
                "3. TickTick services are accessible from your network"
            )
        elif "Could Not Complete Request" in error_msg:
            raise RuntimeError(
                f"TickTick API request failed: {error_msg}\n\n"
                "This could indicate:\n"
                "1. Incorrect login credentials (username/password)\n"
                "2. OAuth2 setup issues (client ID/secret)\n"
                "3. Network connectivity problems\n"
                "4. TickTick API service issues\n\n"
                "Try:\n"
                "- Verify you can log in at https://ticktick.com\n"
                "- Check your OAuth2 app settings\n"
                "- Run: python -m src.cli tt auth-status\n"
                "- Run: python -m src.cli tt test-auth"
            )
        else:
            raise RuntimeError(f"Failed to initialize TickTick client: {error_msg}")


def clear_token_cache():
    """Clear the stored TickTick token cache."""
    token_file = get_token_file_path()
    if token_file.exists():
        token_file.unlink()
        print(f"Cleared TickTick token cache: {token_file}")
    else:
        print("No TickTick token cache found to clear.")
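Putting the pieces together, a typical call sequence looks like this (the environment values are placeholders):

```python
import os

os.environ.setdefault("TICKTICK_CLIENT_ID", "your-client-id")          # placeholder
os.environ.setdefault("TICKTICK_CLIENT_SECRET", "your-client-secret")  # placeholder

status = check_token_validity()
if not status["valid"]:
    print(f"Re-authentication needed: {status['reason']}")

client = get_ticktick_client()  # prompts for username/password if not in env
```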
329
src/services/ticktick/client.py
Normal file
@@ -0,0 +1,329 @@
"""
TickTick API client service.
"""

import asyncio
from datetime import datetime, timedelta
from typing import List, Dict, Any, Optional, Union
from dateutil import parser as date_parser

from .direct_client import TickTickDirectClient


class TickTickService:
    """TickTick API service wrapper using direct OAuth API calls."""

    def __init__(self):
        """Initialize the TickTick service."""
        self.client: Optional[TickTickDirectClient] = None
        self._projects_cache: Optional[List[Dict[str, Any]]] = None

    def _ensure_client(self):
        """Ensure the TickTick client is initialized."""
        if self.client is None:
            self.client = TickTickDirectClient()

    def get_tasks(
        self, project_id: Optional[str] = None, completed: bool = False
    ) -> List[Dict[str, Any]]:
        """
        Get tasks from TickTick.

        Args:
            project_id: Filter by specific project ID
            completed: Whether to include completed tasks

        Returns:
            List of task dictionaries
        """
        self._ensure_client()

        # Get tasks directly from API
        if project_id:
            tasks = self.client.get_tasks(project_id=project_id)
        else:
            tasks = self.client.get_tasks()

        # Filter by completion status if needed
        if not completed:
            # Filter out completed tasks (status = 2)
            tasks = [task for task in tasks if task.get("status") != 2]
        else:
            # Only completed tasks
            tasks = [task for task in tasks if task.get("status") == 2]

        return tasks

    def get_projects(self) -> List[Dict[str, Any]]:
        """Get all projects."""
        self._ensure_client()

        if self._projects_cache is None:
            self._projects_cache = self.client.get_projects()

        return self._projects_cache

    def get_project_by_name(self, name: str) -> Optional[Dict[str, Any]]:
        """Find a project by name."""
        projects = self.get_projects()
        for project in projects:
            if project.get("name", "").lower() == name.lower():
                return project
        return None

    def get_tasks_by_project(
        self, project_name: Optional[str] = None
    ) -> List[Dict[str, Any]]:
        """
        Get tasks filtered by project name.

        Args:
            project_name: Name of the project to filter by

        Returns:
            List of task dictionaries
        """
        if not project_name:
            return self.get_tasks()

        # Find project by name
        project = self.get_project_by_name(project_name)
        if not project:
            return []

        return self.get_tasks(project_id=project["id"])

    def get_tasks_by_due_date(
        self, due_date: Union[str, datetime]
    ) -> List[Dict[str, Any]]:
        """
        Get tasks filtered by due date.

        Args:
            due_date: Due date as string or datetime object

        Returns:
            List of task dictionaries
        """
        if isinstance(due_date, str):
            if due_date.lower() == "today":
                target_date = datetime.now().date()
            elif due_date.lower() == "tomorrow":
                target_date = (datetime.now() + timedelta(days=1)).date()
            elif due_date.lower() == "yesterday":
                target_date = (datetime.now() - timedelta(days=1)).date()
            else:
                try:
                    target_date = date_parser.parse(due_date).date()
                except ValueError:
                    raise ValueError(f"Invalid date format: {due_date}")
        else:
            target_date = due_date.date()

        tasks = self.get_tasks()
        filtered_tasks = []

        for task in tasks:
            if task.get("dueDate"):
                try:
                    task_due_date = date_parser.parse(task["dueDate"]).date()
                    if task_due_date == target_date:
                        filtered_tasks.append(task)
                except (ValueError, TypeError):
                    continue

        return filtered_tasks

    def create_task(
        self,
        title: str,
        project_name: Optional[str] = None,
        due_date: Optional[str] = None,
        priority: Optional[int] = None,
        content: Optional[str] = None,
        tags: Optional[List[str]] = None,
    ) -> Dict[str, Any]:
        """
        Create a new task.

        Args:
            title: Task title
            project_name: Project name (optional)
            due_date: Due date string (optional)
            priority: Priority level 0-5 (optional)
            content: Task description/content (optional)
            tags: List of tag names (optional)

        Returns:
            Created task dictionary
        """
        self._ensure_client()

        # Convert project name to ID if provided
        project_id = None
        if project_name:
            project = self.get_project_by_name(project_name)
            if project:
                project_id = project["id"]

        # Process due date
        processed_due_date = None
        if due_date:
            if due_date.lower() == "today":
                processed_due_date = datetime.now().isoformat()
            elif due_date.lower() == "tomorrow":
                processed_due_date = (datetime.now() + timedelta(days=1)).isoformat()
            else:
                try:
                    parsed_date = date_parser.parse(due_date)
                    processed_due_date = parsed_date.isoformat()
                except ValueError:
                    raise ValueError(f"Invalid date format: {due_date}")

        return self.client.create_task(
            title=title,
            content=content,
            project_id=project_id,
            due_date=processed_due_date,
            priority=priority,
            tags=tags,
        )

    def update_task(
        self,
        task_id: str,
        title: Optional[str] = None,
        project_name: Optional[str] = None,
        due_date: Optional[str] = None,
        priority: Optional[int] = None,
        content: Optional[str] = None,
        tags: Optional[List[str]] = None,
    ) -> Dict[str, Any]:
        """
        Update an existing task.

        Args:
            task_id: Task ID to update
            title: New title (optional)
            project_name: New project name (optional)
            due_date: New due date (optional)
            priority: New priority (optional)
            content: New content (optional)
            tags: New tags (optional)

        Returns:
            Updated task dictionary
        """
        self._ensure_client()

        update_data = {}

        if title:
            update_data["title"] = title
        if content:
            update_data["content"] = content
        if priority is not None:
            update_data["priority"] = priority
        if tags:
            update_data["tags"] = tags

        # Convert project name to ID if provided
        if project_name:
            project = self.get_project_by_name(project_name)
            if project:
                update_data["projectId"] = project["id"]

        # Process due date
        if due_date:
            if due_date.lower() == "today":
                update_data["dueDate"] = datetime.now().isoformat()
            elif due_date.lower() == "tomorrow":
                update_data["dueDate"] = (
                    datetime.now() + timedelta(days=1)
                ).isoformat()
            else:
                try:
                    parsed_date = date_parser.parse(due_date)
                    update_data["dueDate"] = parsed_date.isoformat()
                except ValueError:
                    raise ValueError(f"Invalid date format: {due_date}")

        return self.client.update_task(task_id, **update_data)

    def complete_task(self, task_id: str) -> Dict[str, Any]:
        """Mark a task as completed."""
        self._ensure_client()
        return self.client.complete_task(task_id)

    def delete_task(self, task_id: str) -> bool:
        """Delete a task."""
        self._ensure_client()
        return self.client.delete_task(task_id)

    def get_task(self, task_id: str) -> Dict[str, Any]:
        """Get a specific task by ID."""
        self._ensure_client()
        return self.client.get_task(task_id)

    def sync(self):
        """Sync with TickTick servers (clear cache)."""
        self._projects_cache = None

    def search_tasks(self, query: str) -> List[Dict[str, Any]]:
        """
        Search tasks by title or content.

        Args:
            query: Search query string

        Returns:
            List of matching task dictionaries
        """
        tasks = self.get_tasks()
        query_lower = query.lower()

        matching_tasks = []
        for task in tasks:
            title = task.get("title", "").lower()
            content = task.get("content", "").lower()

            if query_lower in title or query_lower in content:
                matching_tasks.append(task)

        return matching_tasks

    def get_tasks_by_priority(self, min_priority: int = 1) -> List[Dict[str, Any]]:
        """
        Get tasks filtered by priority level.

        Args:
            min_priority: Minimum priority level (1-5)

        Returns:
            List of task dictionaries
        """
        tasks = self.get_tasks()
        return [task for task in tasks if task.get("priority", 0) >= min_priority]

    def get_tasks_by_tags(self, tag_names: List[str]) -> List[Dict[str, Any]]:
        """
        Get tasks that have any of the specified tags.

        Args:
            tag_names: List of tag names to search for

        Returns:
            List of task dictionaries
        """
        tasks = self.get_tasks()
        tag_names_lower = [tag.lower() for tag in tag_names]

        matching_tasks = []
        for task in tasks:
            task_tags = task.get("tags", [])
            task_tags_lower = [tag.lower() for tag in task_tags]

            if any(tag in task_tags_lower for tag in tag_names_lower):
                matching_tasks.append(task)

        return matching_tasks
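Typical use of the service wrapper, assuming OAuth tokens are already cached (the values are illustrative):

```python
svc = TickTickService()

task = svc.create_task(
    "Buy groceries",          # title
    project_name="Personal",  # resolved to a project ID via get_project_by_name
    due_date="tomorrow",      # shortcut handled by the date logic above
    priority=3,
)
svc.complete_task(task["id"])
```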
144
src/services/ticktick/direct_client.py
Normal file
@@ -0,0 +1,144 @@
"""
Direct TickTick API client using only OAuth tokens.
This bypasses the flawed ticktick-py library that incorrectly requires username/password.
"""

import requests
import urllib3
from typing import Optional, Dict, List, Any
from datetime import datetime
import logging

from .auth import load_stored_tokens

# Suppress SSL warnings for corporate networks
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)


class TickTickDirectClient:
    """Direct TickTick API client using OAuth only."""

    BASE_URL = "https://api.ticktick.com/open/v1"

    def __init__(self):
        """Initialize the client with OAuth token."""
        self.tokens = load_stored_tokens()
        if not self.tokens:
            raise RuntimeError(
                "No OAuth tokens found. Please run authentication first."
            )

        self.access_token = self.tokens["access_token"]
        self.session = requests.Session()
        self.session.verify = False  # Disable SSL verification for corporate networks

        # Set headers
        self.session.headers.update(
            {
                "Authorization": f"Bearer {self.access_token}",
                "Content-Type": "application/json",
            }
        )

    def _request(self, method: str, endpoint: str, **kwargs) -> requests.Response:
        """Make a request to the TickTick API."""
        url = f"{self.BASE_URL}/{endpoint.lstrip('/')}"

        try:
            response = self.session.request(method, url, **kwargs)
            response.raise_for_status()
            return response
        except requests.exceptions.HTTPError as e:
            if response.status_code == 401:
                raise RuntimeError(
                    "OAuth token expired or invalid. Please re-authenticate."
                )
            elif response.status_code == 429:
                raise RuntimeError("Rate limit exceeded. Please try again later.")
            else:
                raise RuntimeError(f"API request failed: {e}")

    def get_projects(self) -> List[Dict[str, Any]]:
        """Get all projects."""
        response = self._request("GET", "/project")
        return response.json()

    def get_tasks(self, project_id: Optional[str] = None) -> List[Dict[str, Any]]:
        """Get tasks, optionally filtered by project."""
        # NOTE: TickTick's GET /task endpoint appears to have issues (returns 500)
        # This is a known limitation of their API
        # For now, we'll return an empty list and log the issue

        import logging

        logging.warning(
            "TickTick GET /task endpoint returns 500 server error - this is a known API issue"
        )

        # TODO: Implement alternative task fetching when TickTick fixes their API
        # Possible workarounds:
        # 1. Use websocket/sync endpoints
        # 2. Cache created tasks locally
        # 3. Use different API version when available

        return []

    def create_task(
        self,
        title: str,
        content: Optional[str] = None,
        project_id: Optional[str] = None,
        due_date: Optional[str] = None,
        priority: Optional[int] = None,
        tags: Optional[List[str]] = None,
    ) -> Dict[str, Any]:
        """Create a new task."""
        task_data = {"title": title}

        if content:
            task_data["content"] = content
        if project_id:
            task_data["projectId"] = project_id
        if due_date:
            # Convert date string to ISO format if needed
            task_data["dueDate"] = due_date
        if priority is not None:
            task_data["priority"] = priority
        if tags:
            task_data["tags"] = tags

        response = self._request("POST", "/task", json=task_data)
        return response.json()

    def update_task(self, task_id: str, **kwargs) -> Dict[str, Any]:
        """Update an existing task."""
        response = self._request("POST", f"/task/{task_id}", json=kwargs)
        return response.json()

    def complete_task(self, task_id: str) -> Dict[str, Any]:
        """Mark a task as completed."""
        return self.update_task(task_id, status=2)  # 2 = completed

    def delete_task(self, task_id: str) -> bool:
        """Delete a task."""
        try:
            self._request("DELETE", f"/task/{task_id}")
            return True
        except Exception:
            return False

    def get_task(self, task_id: str) -> Dict[str, Any]:
        """Get a specific task by ID."""
        # NOTE: TickTick's GET /task/{id} endpoint also returns 500 server error
        import logging

        logging.warning(
            f"TickTick GET /task/{task_id} endpoint returns 500 server error - this is a known API issue"
        )

        # Return minimal task info
        return {
            "id": task_id,
            "title": "Task details unavailable (API issue)",
            "status": 0,
        }
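What `TickTickDirectClient.get_projects()` does boils down to one authenticated GET; shown here with plain `requests`, reusing the token loader above (the endpoint and header shape are taken from the class, SSL and retry settings aside):

```python
import requests

tokens = load_stored_tokens()  # returns None if no cached tokens exist
resp = requests.get(
    "https://api.ticktick.com/open/v1/project",
    headers={"Authorization": f"Bearer {tokens['access_token']}"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # list of project dicts
```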
@@ -1,6 +1,7 @@
"""
Utility module for handling calendar events and iCalendar operations.
"""

import re
import os
from datetime import datetime, timedelta
@@ -40,7 +41,7 @@ def clean_text(text):
    if not text:
        return ""
    # Replace 3 or more consecutive underscores with 2 underscores
    return re.sub(r'_{3,}', '__', text)
    return re.sub(r"_{3,}", "__", text)


def escape_ical_text(text):
@@ -63,8 +64,15 @@ def escape_ical_text(text):
    text = text.replace(";", "\\;")
    return text

async def fetch_calendar_events(headers, days_back=1, days_forward=6, fetch_function=None,
                                start_date=None, end_date=None):

async def fetch_calendar_events(
    headers,
    days_back=1,
    days_forward=6,
    fetch_function=None,
    start_date=None,
    end_date=None,
):
    """
    Fetch calendar events from Microsoft Graph API.

@@ -85,7 +93,9 @@ async def fetch_calendar_events(headers, days_back=1, days_forward=6, fetch_func

    # Calculate date range
    if start_date is None:
        start_date = datetime.now().replace(hour=0, minute=0, second=0) - timedelta(days=days_back)
        start_date = datetime.now().replace(hour=0, minute=0, second=0) - timedelta(
            days=days_back
        )

    if end_date is None:
        end_of_today = datetime.now().replace(hour=23, minute=59, second=59)
@@ -99,7 +109,7 @@ async def fetch_calendar_events(headers, days_back=1, days_forward=6, fetch_func
    total_event_url = f"{event_base_url}&$count=true&$select=id"
    try:
        total_response = await fetch_function(total_event_url, headers)
        total_events = total_response.get('@odata.count', 0)
        total_events = total_response.get("@odata.count", 0)
    except Exception as e:
        print(f"Error fetching total events count: {e}")
        total_events = 0
@@ -110,9 +120,9 @@ async def fetch_calendar_events(headers, days_back=1, days_forward=6, fetch_func
        try:
            response_data = await fetch_function(calendar_url, headers)
            if response_data:
                events.extend(response_data.get('value', []))
                events.extend(response_data.get("value", []))
                # Get the next page URL from @odata.nextLink
                calendar_url = response_data.get('@odata.nextLink')
                calendar_url = response_data.get("@odata.nextLink")
            else:
                print("Received empty response from calendar API")
                break
@@ -123,6 +133,7 @@ async def fetch_calendar_events(headers, days_back=1, days_forward=6, fetch_func
    # Only return the events and total_events
    return events, total_events


def write_event_to_ical(f, event, start, end):
    """
    Write a single event to an iCalendar file.
@@ -140,52 +151,57 @@ def write_event_to_ical(f, event, start, end):
    f.write(f"BEGIN:VEVENT\nSUMMARY:{escape_ical_text(event['subject'])}\n")

    # Handle multi-line description properly
    description = event.get('bodyPreview', '')
    description = event.get("bodyPreview", "")
    if description:
        escaped_description = escape_ical_text(description)
        f.write(f"DESCRIPTION:{escaped_description}\n")

    f.write(f"UID:{event.get('iCalUId', '')}\n")
    f.write(f"LOCATION:{escape_ical_text(event.get('location', {}).get('displayName', ''))}\n")
    f.write(
        f"LOCATION:{escape_ical_text(event.get('location', {}).get('displayName', ''))}\n"
    )
    f.write(f"CLASS:{event.get('showAs', '')}\n")
    f.write(f"STATUS:{event.get('responseStatus', {}).get('response', '')}\n")

    if 'onlineMeeting' in event and event['onlineMeeting']:
    if "onlineMeeting" in event and event["onlineMeeting"]:
        f.write(f"URL:{event.get('onlineMeeting', {}).get('joinUrl', '')}\n")

    # Write start and end times with timezone info in iCalendar format
    if start.tzinfo == UTC:
        f.write(f"DTSTART:{start.strftime('%Y%m%dT%H%M%SZ')}\n")
    else:
        tz_name = start_tz.tzname(None) if start_tz else 'UTC'
        tz_name = start_tz.tzname(None) if start_tz else "UTC"
        f.write(f"DTSTART;TZID={tz_name}:{start.strftime('%Y%m%dT%H%M%S')}\n")

    if end.tzinfo == UTC:
        f.write(f"DTEND:{end.strftime('%Y%m%dT%H%M%SZ')}\n")
    else:
        tz_name = end_tz.tzname(None) if end_tz else 'UTC'
        tz_name = end_tz.tzname(None) if end_tz else "UTC"
        f.write(f"DTEND;TZID={tz_name}:{end.strftime('%Y%m%dT%H%M%S')}\n")

    # Handle recurrence rules
    if 'recurrence' in event and event['recurrence']:
        for rule in event['recurrence']:
            if rule.startswith('RRULE'):
                rule_parts = rule.split(';')
    if "recurrence" in event and event["recurrence"]:
        for rule in event["recurrence"]:
            if rule.startswith("RRULE"):
                rule_parts = rule.split(";")
                new_rule_parts = []
                for part in rule_parts:
                    if part.startswith('UNTIL='):
                        until_value = part.split('=')[1]
                    if part.startswith("UNTIL="):
                        until_value = part.split("=")[1]
                        until_date = parser.isoparse(until_value)
                        if start.tzinfo is not None and until_date.tzinfo is None:
                            until_date = until_date.replace(tzinfo=start.tzinfo)
                        new_rule_parts.append(f"UNTIL={until_date.strftime('%Y%m%dT%H%M%SZ')}")
                        new_rule_parts.append(
                            f"UNTIL={until_date.strftime('%Y%m%dT%H%M%SZ')}"
                        )
                    else:
                        new_rule_parts.append(part)
                rule = ';'.join(new_rule_parts)
                rule = ";".join(new_rule_parts)
                f.write(f"{rule}\n")

    f.write("END:VEVENT\n")
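For orientation, one event written by the function above comes out roughly like this (all values invented):

```python
# Illustrative output of write_event_to_ical for a single UTC event:
sample_vevent = """BEGIN:VEVENT
SUMMARY:Team standup
DESCRIPTION:Daily sync
UID:outlook-AAMkADExampleId
LOCATION:Room 2
CLASS:busy
STATUS:accepted
DTSTART:20250106T090000Z
DTEND:20250106T091500Z
END:VEVENT
"""
```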
|
||||
|
||||
def save_events_to_vdir(events, org_vdir_path, progress, task_id, dry_run=False):
|
||||
"""
|
||||
Save events to vdir format (one file per event).
|
||||
@@ -201,7 +217,9 @@ def save_events_to_vdir(events, org_vdir_path, progress, task_id, dry_run=False)
|
||||
Number of events processed
|
||||
"""
|
||||
if dry_run:
|
||||
        progress.console.print(f"[DRY-RUN] Would save {len(events)} events to vdir format in {org_vdir_path}")
        progress.console.print(
            f"[DRY-RUN] Would save {len(events)} events to vdir format in {org_vdir_path}"
        )
        return len(events)

    os.makedirs(org_vdir_path, exist_ok=True)
@@ -212,29 +230,26 @@ def save_events_to_vdir(events, org_vdir_path, progress, task_id, dry_run=False)
    for file_path in glob.glob(os.path.join(org_vdir_path, "*.ics")):
        file_name = os.path.basename(file_path)
        file_mod_time = os.path.getmtime(file_path)
        existing_files[file_name] = {
            'path': file_path,
            'mtime': file_mod_time
        }
        existing_files[file_name] = {"path": file_path, "mtime": file_mod_time}

    processed_files = set()

    for event in events:
        progress.advance(task_id)
        if 'start' not in event or 'end' not in event:
        if "start" not in event or "end" not in event:
            continue

        # Parse start and end times with timezone information
        start = parser.isoparse(event['start']['dateTime'])
        end = parser.isoparse(event['end']['dateTime'])
        start = parser.isoparse(event["start"]["dateTime"])
        end = parser.isoparse(event["end"]["dateTime"])

        uid = event.get('iCalUId', '')
        uid = event.get("iCalUId", "")
        if not uid:
            # Generate a unique ID if none exists
            uid = f"outlook-{event.get('id', '')}"

        # Create a filename based on the UID
        safe_filename = re.sub(r'[^\w\-]', '_', uid) + ".ics"
        safe_filename = re.sub(r"[^\w\-]", "_", uid) + ".ics"
        event_path = os.path.join(org_vdir_path, safe_filename)
        processed_files.add(safe_filename)

@@ -242,15 +257,19 @@ def save_events_to_vdir(events, org_vdir_path, progress, task_id, dry_run=False)
        should_update = True
        if safe_filename in existing_files:
            # Only update if the event has been modified since the file was last updated
            if 'lastModifiedDateTime' in event:
                last_modified = parser.isoparse(event['lastModifiedDateTime']).timestamp()
                file_mtime = existing_files[safe_filename]['mtime']
            if "lastModifiedDateTime" in event:
                last_modified = parser.isoparse(
                    event["lastModifiedDateTime"]
                ).timestamp()
                file_mtime = existing_files[safe_filename]["mtime"]
                if last_modified <= file_mtime:
                    should_update = False
                    progress.console.print(f"Skipping unchanged event: {event['subject']}")
                    progress.console.print(
                        f"Skipping unchanged event: {event['subject']}"
                    )

        if should_update:
            with open(event_path, 'w') as f:
            with open(event_path, "w") as f:
                f.write("BEGIN:VCALENDAR\nVERSION:2.0\n")
                write_event_to_ical(f, event, start, end)
                f.write("END:VCALENDAR\n")
@@ -258,12 +277,24 @@ def save_events_to_vdir(events, org_vdir_path, progress, task_id, dry_run=False)
    # Remove files for events that no longer exist in the calendar view
    for file_name in existing_files:
        if file_name not in processed_files:
            progress.console.print(f"Removing obsolete event file: {truncate_id(file_name)}")
            os.remove(existing_files[file_name]['path'])
            progress.console.print(
                f"Removing obsolete event file: {truncate_id(file_name)}"
            )
            os.remove(existing_files[file_name]["path"])

    # Create sync timestamp to track when this download sync completed
    timestamp_file = os.path.join(org_vdir_path, ".sync_timestamp")
    try:
        with open(timestamp_file, "w") as f:
            f.write(str(datetime.now().timestamp()))
        progress.console.print(f"Updated sync timestamp for calendar monitoring")
    except IOError as e:
        progress.console.print(f"Warning: Could not create sync timestamp: {e}")

    progress.console.print(f"Saved {len(events)} events to {org_vdir_path}")
    return len(events)


def save_events_to_file(events, output_file, progress, task_id, dry_run=False):
    """
    Save all events to a single iCalendar file.
@@ -285,14 +316,14 @@ def save_events_to_file(events, output_file, progress, task_id, dry_run=False):
    os.makedirs(os.path.dirname(output_file), exist_ok=True)
    progress.console.print(f"Saving events to {output_file}...")

    with open(output_file, 'w') as f:
    with open(output_file, "w") as f:
        f.write("BEGIN:VCALENDAR\nVERSION:2.0\n")
        for event in events:
            progress.advance(task_id)
            if 'start' in event and 'end' in event:
            if "start" in event and "end" in event:
                # Parse start and end times with timezone information
                start = parser.isoparse(event['start']['dateTime'])
                end = parser.isoparse(event['end']['dateTime'])
                start = parser.isoparse(event["start"]["dateTime"])
                end = parser.isoparse(event["end"]["dateTime"])
                write_event_to_ical(f, event, start, end)
        f.write("END:VCALENDAR\n")

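The `lastModifiedDateTime` / mtime comparison above is what makes repeated syncs incremental. A minimal standalone sketch of the same check, with invented values:

```python
# Simplified sketch of the skip logic above: an event is rewritten only when
# Graph's lastModifiedDateTime is newer than the local .ics file's mtime.
# The event dict and mtime below are invented for illustration.
from dateutil import parser

event = {"lastModifiedDateTime": "2025-01-10T08:30:00Z", "subject": "Standup"}
file_mtime = 1736500000.0  # what os.path.getmtime(...) returned for the existing .ics

last_modified = parser.isoparse(event["lastModifiedDateTime"]).timestamp()
should_update = last_modified > file_mtime
print("rewrite file" if should_update else f"skip unchanged: {event['subject']}")
```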
@@ -1,6 +1,7 @@
"""
Maildir operations for handling local mail storage.
"""

import os
import email
import base64
@@ -11,11 +12,30 @@ from email import encoders
import time
import aiohttp
import re
import logging

# Suppress HTTP library debug logging
logging.getLogger("aiohttp").setLevel(logging.ERROR)
logging.getLogger("aiohttp.access").setLevel(logging.ERROR)

from src.utils.calendar_utils import truncate_id
from src.utils.mail_utils.helpers import safe_filename, ensure_directory_exists, format_datetime, format_mime_date
from src.utils.mail_utils.helpers import (
    safe_filename,
    ensure_directory_exists,
    format_datetime,
    format_mime_date,
)

async def save_mime_to_maildir_async(maildir_path, message, attachments_dir, headers, progress, dry_run=False, download_attachments=False):

async def save_mime_to_maildir_async(
    maildir_path,
    message,
    attachments_dir,
    headers,
    progress,
    dry_run=False,
    download_attachments=False,
):
    """
    Save a message from Microsoft Graph API to a Maildir.

@@ -31,30 +51,39 @@ async def save_mime_to_maildir_async(maildir_path, message, attachments_dir, hea
    Returns:
        None
    """
    message_id = message.get('id', '')
    message_id = message.get("id", "")

    # Determine target directory based on read status
    target_dir = os.path.join(maildir_path, 'cur' if message.get('isRead', False) else 'new')
    target_dir = os.path.join(
        maildir_path, "cur" if message.get("isRead", False) else "new"
    )
    ensure_directory_exists(target_dir)

    # Check if the file already exists in either new or cur
    new_path = os.path.join(maildir_path, 'new', f"{message_id}.eml")
    cur_path = os.path.join(maildir_path, 'cur', f"{message_id}.eml")
    new_path = os.path.join(maildir_path, "new", f"{message_id}.eml")
    cur_path = os.path.join(maildir_path, "cur", f"{message_id}.eml")

    if os.path.exists(new_path) or os.path.exists(cur_path):
        return  # Skip if already exists

    # Create MIME email
    mime_msg = await create_mime_message_async(message, headers, attachments_dir, progress, download_attachments)
    mime_msg = await create_mime_message_async(
        message, headers, attachments_dir, progress, download_attachments
    )

    # Only save file if not in dry run mode
    if not dry_run:
        with open(os.path.join(target_dir, f"{message_id}.eml"), 'wb') as f:
        with open(os.path.join(target_dir, f"{message_id}.eml"), "wb") as f:
            f.write(mime_msg.as_bytes())
    else:
        progress.console.print(f"[DRY-RUN] Would save message: {message.get('subject', 'No Subject')}")
        progress.console.print(
            f"[DRY-RUN] Would save message: {message.get('subject', 'No Subject')}"
        )

async def create_mime_message_async(message, headers, attachments_dir, progress, download_attachments=False):

async def create_mime_message_async(
    message, headers, attachments_dir, progress, download_attachments=False
):
    """
    Create a MIME message from Microsoft Graph API message data.

@@ -72,33 +101,41 @@ async def create_mime_message_async(message, headers, attachments_dir, progress,
    mime_msg = MIMEMultipart()

    # Message headers
    mime_msg['Message-ID'] = message.get('id', '')
    mime_msg['Subject'] = message.get('subject', 'No Subject')
    mime_msg["Message-ID"] = message.get("id", "")
    mime_msg["Subject"] = message.get("subject", "No Subject")

    # Sender information
    sender = message.get('from', {}).get('emailAddress', {})
    sender = message.get("from", {}).get("emailAddress", {})
    if sender:
        mime_msg['From'] = f"{sender.get('name', '')} <{sender.get('address', '')}>".strip()
        mime_msg["From"] = (
            f"{sender.get('name', '')} <{sender.get('address', '')}>".strip()
        )

    # Recipients
    to_recipients = message.get('toRecipients', [])
    cc_recipients = message.get('ccRecipients', [])
    to_recipients = message.get("toRecipients", [])
    cc_recipients = message.get("ccRecipients", [])

    if to_recipients:
        to_list = [f"{r.get('emailAddress', {}).get('name', '')} <{r.get('emailAddress', {}).get('address', '')}>".strip() for r in to_recipients]
        mime_msg['To'] = ', '.join(to_list)
        to_list = [
            f"{r.get('emailAddress', {}).get('name', '')} <{r.get('emailAddress', {}).get('address', '')}>".strip()
            for r in to_recipients
        ]
        mime_msg["To"] = ", ".join(to_list)

    if cc_recipients:
        cc_list = [f"{r.get('emailAddress', {}).get('name', '')} <{r.get('emailAddress', {}).get('address', '')}>".strip() for r in cc_recipients]
        mime_msg['Cc'] = ', '.join(cc_list)
        cc_list = [
            f"{r.get('emailAddress', {}).get('name', '')} <{r.get('emailAddress', {}).get('address', '')}>".strip()
            for r in cc_recipients
        ]
        mime_msg["Cc"] = ", ".join(cc_list)

    # Date - using the new format_mime_date function to ensure RFC 5322 compliance
    received_datetime = message.get('receivedDateTime', '')
    received_datetime = message.get("receivedDateTime", "")
    if received_datetime:
        mime_msg['Date'] = format_mime_date(received_datetime)
        mime_msg["Date"] = format_mime_date(received_datetime)

    # First try the direct body content approach
    message_id = message.get('id', '')
    message_id = message.get("id", "")
    try:
        # First get the message with body content
        body_url = f"https://graph.microsoft.com/v1.0/me/messages/{message_id}?$select=body,bodyPreview"
@@ -108,46 +145,62 @@ async def create_mime_message_async(message, headers, attachments_dir, progress,
                    body_data = await response.json()

                    # Get body content
                    body_content = body_data.get('body', {}).get('content', '')
                    body_type = body_data.get('body', {}).get('contentType', 'text')
                    body_preview = body_data.get('bodyPreview', '')
                    body_content = body_data.get("body", {}).get("content", "")
                    body_type = body_data.get("body", {}).get("contentType", "text")
                    body_preview = body_data.get("bodyPreview", "")

                    # If we have body content, use it
                    if body_content:
                        if body_type.lower() == 'html':
                        if body_type.lower() == "html":
                            # Add both HTML and plain text versions
                            # Plain text conversion
                            plain_text = re.sub(r'<br\s*/?>', '\n', body_content)
                            plain_text = re.sub(r'<[^>]*>', '', plain_text)
                            plain_text = re.sub(r"<br\s*/?>", "\n", body_content)
                            plain_text = re.sub(r"<[^>]*>", "", plain_text)

                            mime_msg.attach(MIMEText(plain_text, 'plain'))
                            mime_msg.attach(MIMEText(body_content, 'html'))
                            mime_msg.attach(MIMEText(plain_text, "plain"))
                            mime_msg.attach(MIMEText(body_content, "html"))
                        else:
                            # Just plain text
                            mime_msg.attach(MIMEText(body_content, 'plain'))
                            mime_msg.attach(MIMEText(body_content, "plain"))
                    elif body_preview:
                        # Use preview if we have it
                        mime_msg.attach(MIMEText(f"{body_preview}\n\n[Message preview only. Full content not available.]", 'plain'))
                        mime_msg.attach(
                            MIMEText(
                                f"{body_preview}\n\n[Message preview only. Full content not available.]",
                                "plain",
                            )
                        )
                    else:
                        # Fallback to MIME content
                        progress.console.print(f"No direct body content for message {truncate_id(message_id)}, trying MIME content...")
                        await fetch_mime_content(mime_msg, message_id, headers, progress)
                        progress.console.print(
                            f"No direct body content for message {truncate_id(message_id)}, trying MIME content..."
                        )
                        await fetch_mime_content(
                            mime_msg, message_id, headers, progress
                        )
                else:
                    progress.console.print(f"Failed to get message body: {response.status}. Trying MIME content...")
                    progress.console.print(
                        f"Failed to get message body: {response.status}. Trying MIME content..."
                    )
                    await fetch_mime_content(mime_msg, message_id, headers, progress)
    except Exception as e:
        progress.console.print(f"Error getting message body: {e}. Trying MIME content...")
        progress.console.print(
            f"Error getting message body: {e}. Trying MIME content..."
        )
        await fetch_mime_content(mime_msg, message_id, headers, progress)

    # Handle attachments only if we want to download them
    if download_attachments:
        await add_attachments_async(mime_msg, message, headers, attachments_dir, progress)
        await add_attachments_async(
            mime_msg, message, headers, attachments_dir, progress
        )
    else:
        # Add a header to indicate attachment info was skipped
        mime_msg['X-Attachments-Skipped'] = 'True'
        mime_msg["X-Attachments-Skipped"] = "True"

    return mime_msg


async def fetch_mime_content(mime_msg, message_id, headers, progress):
    """
    Fetch and add MIME content to a message when direct body access fails.
@@ -159,7 +212,9 @@ async def fetch_mime_content(mime_msg, message_id, headers, progress):
        progress: Progress instance for updating progress bars.
    """
    # Fallback to getting the MIME content
    message_content_url = f"https://graph.microsoft.com/v1.0/me/messages/{message_id}/$value"
    message_content_url = (
        f"https://graph.microsoft.com/v1.0/me/messages/{message_id}/$value"
    )
    try:
        async with aiohttp.ClientSession() as session:
            async with session.get(message_content_url, headers=headers) as response:
@@ -167,41 +222,58 @@ async def fetch_mime_content(mime_msg, message_id, headers, progress):
                    full_content = await response.text()

                    # Check for body tags
                    body_match = re.search(r'<body[^>]*>(.*?)</body>', full_content, re.DOTALL | re.IGNORECASE)
                    body_match = re.search(
                        r"<body[^>]*>(.*?)</body>",
                        full_content,
                        re.DOTALL | re.IGNORECASE,
                    )
                    if body_match:
                        body_content = body_match.group(1)
                        # Simple HTML to text conversion
                        body_text = re.sub(r'<br\s*/?>', '\n', body_content)
                        body_text = re.sub(r'<[^>]*>', '', body_text)
                        body_text = re.sub(r"<br\s*/?>", "\n", body_content)
                        body_text = re.sub(r"<[^>]*>", "", body_text)

                        # Add the plain text body
                        mime_msg.attach(MIMEText(body_text, 'plain'))
                        mime_msg.attach(MIMEText(body_text, "plain"))

                        # Also add the HTML body
                        mime_msg.attach(MIMEText(full_content, 'html'))
                        mime_msg.attach(MIMEText(full_content, "html"))
                    else:
                        # Fallback - try to find content between Content-Type: text/html and next boundary
                        html_parts = re.findall(r'Content-Type: text/html.*?\r?\n\r?\n(.*?)(?:\r?\n\r?\n|$)',
                                                full_content, re.DOTALL | re.IGNORECASE)
                        html_parts = re.findall(
                            r"Content-Type: text/html.*?\r?\n\r?\n(.*?)(?:\r?\n\r?\n|$)",
                            full_content,
                            re.DOTALL | re.IGNORECASE,
                        )
                        if html_parts:
                            html_content = html_parts[0]
                            mime_msg.attach(MIMEText(html_content, 'html'))
                            mime_msg.attach(MIMEText(html_content, "html"))

                            # Also make plain text version
                            plain_text = re.sub(r'<br\s*/?>', '\n', html_content)
                            plain_text = re.sub(r'<[^>]*>', '', plain_text)
                            mime_msg.attach(MIMEText(plain_text, 'plain'))
                            plain_text = re.sub(r"<br\s*/?>", "\n", html_content)
                            plain_text = re.sub(r"<[^>]*>", "", plain_text)
                            mime_msg.attach(MIMEText(plain_text, "plain"))
                        else:
                            # Just use the raw content as text if nothing else works
                            mime_msg.attach(MIMEText(full_content, 'plain'))
                            progress.console.print(f"Using raw content for message {message_id} - no body tags found")
                            mime_msg.attach(MIMEText(full_content, "plain"))
                            progress.console.print(
                                f"Using raw content for message {message_id} - no body tags found"
                            )
                else:
                    error_text = await response.text()
                    progress.console.print(f"Failed to get MIME content: {response.status} {error_text}")
                    mime_msg.attach(MIMEText(f"Failed to retrieve message body: HTTP {response.status}", 'plain'))
                    progress.console.print(
                        f"Failed to get MIME content: {response.status} {error_text}"
                    )
                    mime_msg.attach(
                        MIMEText(
                            f"Failed to retrieve message body: HTTP {response.status}",
                            "plain",
                        )
                    )
    except Exception as e:
        progress.console.print(f"Error retrieving MIME content: {e}")
        mime_msg.attach(MIMEText(f"Failed to retrieve message body: {str(e)}", 'plain'))
        mime_msg.attach(MIMEText(f"Failed to retrieve message body: {str(e)}", "plain"))


async def add_attachments_async(mime_msg, message, headers, attachments_dir, progress):
    """
@@ -217,10 +289,12 @@ async def add_attachments_async(mime_msg, message, headers, attachments_dir, pro
    Returns:
        None
    """
    message_id = message.get('id', '')
    message_id = message.get("id", "")

    # Get attachments list
    attachments_url = f"https://graph.microsoft.com/v1.0/me/messages/{message_id}/attachments"
    attachments_url = (
        f"https://graph.microsoft.com/v1.0/me/messages/{message_id}/attachments"
    )

    async with aiohttp.ClientSession() as session:
        async with session.get(attachments_url, headers=headers) as response:
@@ -228,7 +302,7 @@ async def add_attachments_async(mime_msg, message, headers, attachments_dir, pro
                return

            attachments_data = await response.json()
            attachments = attachments_data.get('value', [])
            attachments = attachments_data.get("value", [])

            if not attachments:
                return
@@ -238,33 +312,42 @@ async def add_attachments_async(mime_msg, message, headers, attachments_dir, pro
            ensure_directory_exists(message_attachments_dir)

            # Add a header with attachment count
            mime_msg['X-Attachment-Count'] = str(len(attachments))
            mime_msg["X-Attachment-Count"] = str(len(attachments))

            for idx, attachment in enumerate(attachments):
                attachment_name = safe_filename(attachment.get('name', 'attachment'))
                attachment_type = attachment.get('contentType', 'application/octet-stream')
                attachment_name = safe_filename(attachment.get("name", "attachment"))
                attachment_type = attachment.get(
                    "contentType", "application/octet-stream"
                )

                # Add attachment info to headers for reference
                mime_msg[f'X-Attachment-{idx+1}-Name'] = attachment_name
                mime_msg[f'X-Attachment-{idx+1}-Type'] = attachment_type
                mime_msg[f"X-Attachment-{idx + 1}-Name"] = attachment_name
                mime_msg[f"X-Attachment-{idx + 1}-Type"] = attachment_type

                attachment_part = MIMEBase(*attachment_type.split('/', 1))
                attachment_part = MIMEBase(*attachment_type.split("/", 1))

                # Get attachment content
                if 'contentBytes' in attachment:
                    attachment_content = base64.b64decode(attachment['contentBytes'])
                if "contentBytes" in attachment:
                    attachment_content = base64.b64decode(attachment["contentBytes"])

                    # Save attachment to disk
                    attachment_path = os.path.join(message_attachments_dir, attachment_name)
                    with open(attachment_path, 'wb') as f:
                    attachment_path = os.path.join(
                        message_attachments_dir, attachment_name
                    )
                    with open(attachment_path, "wb") as f:
                        f.write(attachment_content)

                    # Add to MIME message
                    attachment_part.set_payload(attachment_content)
                    encoders.encode_base64(attachment_part)
                    attachment_part.add_header('Content-Disposition', f'attachment; filename="{attachment_name}"')
                    attachment_part.add_header(
                        "Content-Disposition",
                        f'attachment; filename="{attachment_name}"',
                    )
                    mime_msg.attach(attachment_part)

                    progress.console.print(f"Downloaded attachment: {attachment_name}")
                else:
                    progress.console.print(f"Skipping attachment with no content: {attachment_name}")
                    progress.console.print(
                        f"Skipping attachment with no content: {attachment_name}"
                    )

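The `new`/`cur` split above follows the standard Maildir convention (unread in `new/`, read in `cur/`), so anything this module writes can be read back with the standard library. A short sketch, assuming a hypothetical `~/Mail/corteva/INBOX` folder created by the sync:

```python
# Sketch: reading messages saved by the Maildir code above with the stdlib.
# The path is invented; adjust it to wherever the sync daemon writes mail.
import mailbox
from pathlib import Path

maildir = Path.home() / "Mail" / "corteva" / "INBOX"
mbox = mailbox.Maildir(str(maildir), create=False)
for key, msg in mbox.items():
    # Headers set by create_mime_message_async, including the attachment count.
    print(key, msg.get("Subject", "No Subject"), msg.get("X-Attachment-Count", "0"))
```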
90
src/utils/notifications.py
Normal file
@@ -0,0 +1,90 @@
"""
|
||||
macOS notification utilities for GTD terminal tools.
|
||||
"""
|
||||
|
||||
import subprocess
|
||||
import platform
|
||||
from typing import Optional
|
||||
|
||||
|
||||
def send_notification(
|
||||
title: str,
|
||||
message: str,
|
||||
subtitle: Optional[str] = None,
|
||||
sound: Optional[str] = None,
|
||||
) -> bool:
|
||||
"""
|
||||
Send a macOS notification using osascript.
|
||||
|
||||
Args:
|
||||
title: The notification title
|
||||
message: The notification message body
|
||||
subtitle: Optional subtitle
|
||||
sound: Optional sound name (e.g., "default", "Glass", "Ping")
|
||||
|
||||
Returns:
|
||||
bool: True if notification was sent successfully, False otherwise
|
||||
"""
|
||||
if platform.system() != "Darwin":
|
||||
return False
|
||||
|
||||
try:
|
||||
# Escape quotes for AppleScript string literals
|
||||
def escape_applescript_string(text: str) -> str:
|
||||
return text.replace("\\", "\\\\").replace('"', '\\"')
|
||||
|
||||
escaped_title = escape_applescript_string(title)
|
||||
escaped_message = escape_applescript_string(message)
|
||||
|
||||
# Build the AppleScript command
|
||||
script_parts = [
|
||||
f'display notification "{escaped_message}"',
|
||||
f'with title "{escaped_title}"',
|
||||
]
|
||||
|
||||
if subtitle:
|
||||
escaped_subtitle = escape_applescript_string(subtitle)
|
||||
script_parts.append(f'subtitle "{escaped_subtitle}"')
|
||||
|
||||
if sound:
|
||||
escaped_sound = escape_applescript_string(sound)
|
||||
script_parts.append(f'sound name "{escaped_sound}"')
|
||||
|
||||
script = " ".join(script_parts)
|
||||
|
||||
# Execute the notification by passing script through stdin
|
||||
subprocess.run(
|
||||
["osascript"], input=script, check=True, capture_output=True, text=True
|
||||
)
|
||||
return True
|
||||
except subprocess.CalledProcessError:
|
||||
return False
|
||||
except Exception:
|
||||
return False
|
||||
except Exception:
|
||||
return False
|
||||
except Exception:
|
||||
return False
|
||||
|
||||
|
||||
def notify_new_emails(count: int, org: str = ""):
|
||||
"""
|
||||
Send notification about new email messages.
|
||||
|
||||
Args:
|
||||
count: Number of new messages
|
||||
org: Organization name (optional)
|
||||
"""
|
||||
if count <= 0:
|
||||
return
|
||||
|
||||
if count == 1:
|
||||
title = "New Email"
|
||||
message = "You have 1 new message"
|
||||
else:
|
||||
title = "New Emails"
|
||||
message = f"You have {count} new messages"
|
||||
|
||||
subtitle = f"from {org}" if org else None
|
||||
|
||||
send_notification(title=title, message=message, subtitle=subtitle, sound="default")
|
||||
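A minimal usage sketch for these helpers, assuming the module is importable as `src.utils.notifications` and the process is running on macOS; the titles, counts, and org name are invented:

```python
# Minimal usage sketch for the notification helpers above (hypothetical values).
from src.utils.notifications import send_notification, notify_new_emails

# Direct call: returns False on non-macOS platforms or if osascript fails.
ok = send_notification(
    title="Sync complete",
    message="Calendar and mail are up to date",
    subtitle="luk",   # optional
    sound="Glass",    # optional macOS sound name
)
print(f"notification delivered: {ok}")

# Convenience wrapper for new-mail notifications:
# "New Emails" / "You have 3 new messages" with subtitle "from corteva".
notify_new_emails(count=3, org="corteva")
```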
352
src/utils/platform.py
Normal file
@@ -0,0 +1,352 @@
"""Cross-platform compatibility utilities."""
|
||||
|
||||
import os
|
||||
import sys
|
||||
import platform
|
||||
import subprocess
|
||||
from pathlib import Path
|
||||
from typing import Optional, Dict, Any
|
||||
|
||||
|
||||
def get_platform_info() -> Dict[str, str]:
|
||||
"""Get platform information for compatibility checks."""
|
||||
return {
|
||||
"system": platform.system(),
|
||||
"release": platform.release(),
|
||||
"version": platform.version(),
|
||||
"machine": platform.machine(),
|
||||
"processor": platform.processor(),
|
||||
"python_version": platform.python_version(),
|
||||
"python_implementation": platform.python_implementation(),
|
||||
}
|
||||
|
||||
|
||||
def is_supported_platform() -> bool:
|
||||
"""Check if the current platform is supported."""
|
||||
system = platform.system()
|
||||
python_version = tuple(map(int, platform.python_version().split(".")))
|
||||
|
||||
# Check Python version
|
||||
if python_version < (3, 12):
|
||||
return False
|
||||
|
||||
# Check operating system
|
||||
supported_systems = ["Darwin", "Linux", "Windows"]
|
||||
return system in supported_systems
|
||||
|
||||
|
||||
def get_default_config_dir() -> Path:
|
||||
"""Get platform-specific config directory."""
|
||||
system = platform.system()
|
||||
|
||||
if system == "Darwin": # macOS
|
||||
return Path.home() / "Library" / "Application Support" / "luk"
|
||||
elif system == "Linux":
|
||||
config_dir = os.environ.get("XDG_CONFIG_HOME")
|
||||
if config_dir:
|
||||
return Path(config_dir) / "luk"
|
||||
return Path.home() / ".config" / "luk"
|
||||
elif system == "Windows":
|
||||
return Path(os.environ.get("APPDATA", "")) / "luk"
|
||||
else:
|
||||
# Fallback to ~/.config
|
||||
return Path.home() / ".config" / "luk"
|
||||
|
||||
|
||||
def get_default_data_dir() -> Path:
|
||||
"""Get platform-specific data directory."""
|
||||
system = platform.system()
|
||||
|
||||
if system == "Darwin": # macOS
|
||||
return Path.home() / "Library" / "Application Support" / "luk"
|
||||
elif system == "Linux":
|
||||
data_dir = os.environ.get("XDG_DATA_HOME")
|
||||
if data_dir:
|
||||
return Path(data_dir) / "luk"
|
||||
return Path.home() / ".local" / "share" / "luk"
|
||||
elif system == "Windows":
|
||||
return Path(os.environ.get("LOCALAPPDATA", "")) / "luk"
|
||||
else:
|
||||
# Fallback to ~/.local/share
|
||||
return Path.home() / ".local" / "share" / "luk"
|
||||
|
||||
|
||||
def get_default_log_dir() -> Path:
|
||||
"""Get platform-specific log directory."""
|
||||
system = platform.system()
|
||||
|
||||
if system == "Darwin": # macOS
|
||||
return Path.home() / "Library" / "Logs" / "luk"
|
||||
elif system == "Linux":
|
||||
data_dir = os.environ.get("XDG_DATA_HOME")
|
||||
if data_dir:
|
||||
return Path(data_dir) / "luk" / "logs"
|
||||
return Path.home() / ".local" / "share" / "luk" / "logs"
|
||||
elif system == "Windows":
|
||||
return Path(os.environ.get("LOCALAPPDATA", "")) / "luk" / "logs"
|
||||
else:
|
||||
# Fallback to ~/.local/share/logs
|
||||
return Path.home() / ".local" / "share" / "luk" / "logs"
|
||||
|
||||
|
||||
def get_default_maildir_path() -> Path:
|
||||
"""Get platform-specific default Maildir path."""
|
||||
system = platform.system()
|
||||
|
||||
if system == "Darwin": # macOS
|
||||
return Path.home() / "Library" / "Mail"
|
||||
elif system == "Linux":
|
||||
return Path.home() / "Mail"
|
||||
elif system == "Windows":
|
||||
return Path.home() / "Mail"
|
||||
else:
|
||||
return Path.home() / "Mail"
|
||||
|
||||
|
||||
def check_dependencies() -> Dict[str, bool]:
|
||||
"""Check if required system dependencies are available."""
|
||||
dependencies = {
|
||||
"python": True, # We're running Python
|
||||
"pip": False,
|
||||
"git": False,
|
||||
"curl": False,
|
||||
"wget": False,
|
||||
}
|
||||
|
||||
# Check for pip
|
||||
try:
|
||||
subprocess.run(["pip", "--version"], capture_output=True, check=True)
|
||||
dependencies["pip"] = True
|
||||
except (subprocess.CalledProcessError, FileNotFoundError):
|
||||
pass
|
||||
|
||||
# Check for git
|
||||
try:
|
||||
subprocess.run(["git", "--version"], capture_output=True, check=True)
|
||||
dependencies["git"] = True
|
||||
except (subprocess.CalledProcessError, FileNotFoundError):
|
||||
pass
|
||||
|
||||
# Check for curl
|
||||
try:
|
||||
subprocess.run(["curl", "--version"], capture_output=True, check=True)
|
||||
dependencies["curl"] = True
|
||||
except (subprocess.CalledProcessError, FileNotFoundError):
|
||||
pass
|
||||
|
||||
# Check for wget
|
||||
try:
|
||||
subprocess.run(["wget", "--version"], capture_output=True, check=True)
|
||||
dependencies["wget"] = True
|
||||
except (subprocess.CalledProcessError, FileNotFoundError):
|
||||
pass
|
||||
|
||||
return dependencies
|
||||
|
||||
|
||||
def get_shell_info() -> Dict[str, str]:
|
||||
"""Get shell information for completion setup."""
|
||||
shell = os.environ.get("SHELL", "")
|
||||
shell_name = Path(shell).name if shell else "unknown"
|
||||
|
||||
return {
|
||||
"shell_path": shell,
|
||||
"shell_name": shell_name,
|
||||
"config_file": get_shell_config_file(shell_name),
|
||||
}
|
||||
|
||||
|
||||
def get_shell_config_file(shell_name: str) -> str:
|
||||
"""Get the config file for a given shell."""
|
||||
shell_configs = {
|
||||
"bash": "~/.bashrc",
|
||||
"zsh": "~/.zshrc",
|
||||
"fish": "~/.config/fish/config.fish",
|
||||
"ksh": "~/.kshrc",
|
||||
"csh": "~/.cshrc",
|
||||
"tcsh": "~/.tcshrc",
|
||||
}
|
||||
|
||||
return shell_configs.get(shell_name, "~/.profile")
|
||||
|
||||
|
||||
def setup_platform_specific() -> None:
|
||||
"""Setup platform-specific configurations."""
|
||||
system = platform.system()
|
||||
|
||||
if system == "Darwin":
|
||||
setup_macos()
|
||||
elif system == "Linux":
|
||||
setup_linux()
|
||||
elif system == "Windows":
|
||||
setup_windows()
|
||||
|
||||
|
||||
def setup_macos() -> None:
|
||||
"""Setup macOS-specific configurations."""
|
||||
# Ensure macOS-specific directories exist
|
||||
config_dir = get_default_config_dir()
|
||||
data_dir = get_default_data_dir()
|
||||
log_dir = get_default_log_dir()
|
||||
|
||||
for directory in [config_dir, data_dir, log_dir]:
|
||||
directory.mkdir(parents=True, exist_ok=True)
|
||||
|
||||
|
||||
def setup_linux() -> None:
|
||||
"""Setup Linux-specific configurations."""
|
||||
# Ensure XDG directories exist
|
||||
config_dir = get_default_config_dir()
|
||||
data_dir = get_default_data_dir()
|
||||
log_dir = get_default_log_dir()
|
||||
|
||||
for directory in [config_dir, data_dir, log_dir]:
|
||||
directory.mkdir(parents=True, exist_ok=True)
|
||||
|
||||
|
||||
def setup_windows() -> None:
|
||||
"""Setup Windows-specific configurations."""
|
||||
# Ensure Windows-specific directories exist
|
||||
config_dir = get_default_config_dir()
|
||||
data_dir = get_default_data_dir()
|
||||
log_dir = get_default_log_dir()
|
||||
|
||||
for directory in [config_dir, data_dir, log_dir]:
|
||||
directory.mkdir(parents=True, exist_ok=True)
|
||||
|
||||
|
||||
def get_platform_specific_commands() -> Dict[str, str]:
|
||||
"""Get platform-specific command equivalents."""
|
||||
system = platform.system()
|
||||
|
||||
if system == "Darwin" or system == "Linux":
|
||||
return {
|
||||
"open": "open" if system == "Darwin" else "xdg-open",
|
||||
"copy": "pbcopy" if system == "Darwin" else "xclip -selection clipboard",
|
||||
"notify": "osascript -e 'display notification \"%s\"'"
|
||||
if system == "Darwin"
|
||||
else 'notify-send "%s"',
|
||||
}
|
||||
elif system == "Windows":
|
||||
return {
|
||||
"open": "start",
|
||||
"copy": "clip",
|
||||
"notify": "powershell -Command \"Add-Type -AssemblyName System.Windows.Forms; [System.Windows.Forms.MessageBox]::Show('%s')\"",
|
||||
}
|
||||
else:
|
||||
return {}
|
||||
|
||||
|
||||
def check_terminal_compatibility() -> Dict[str, bool]:
|
||||
"""Check terminal compatibility for TUI features."""
|
||||
return {
|
||||
"color_support": sys.stdout.isatty(),
|
||||
"unicode_support": True, # Most modern terminals support Unicode
|
||||
"mouse_support": check_mouse_support(),
|
||||
"textual_support": check_textual_support(),
|
||||
}
|
||||
|
||||
|
||||
def check_mouse_support() -> bool:
|
||||
"""Check if terminal supports mouse events."""
|
||||
# This is a basic check - actual mouse support depends on the terminal
|
||||
return sys.stdout.isatty()
|
||||
|
||||
|
||||
def check_textual_support() -> bool:
|
||||
"""Check if Textual TUI framework can run."""
|
||||
try:
|
||||
import textual
|
||||
|
||||
return True
|
||||
except ImportError:
|
||||
return False
|
||||
|
||||
|
||||
def get_platform_recommendations() -> list[str]:
|
||||
"""Get platform-specific recommendations."""
|
||||
system = platform.system()
|
||||
recommendations = []
|
||||
|
||||
if system == "Darwin":
|
||||
recommendations.extend(
|
||||
[
|
||||
"Install iTerm2 or Terminal.app for best TUI experience",
|
||||
"Enable 'Terminal > Preferences > Profiles > Text > Unicode Normalization Form' set to 'None'",
|
||||
"Consider using Homebrew for package management: brew install python3",
|
||||
]
|
||||
)
|
||||
elif system == "Linux":
|
||||
recommendations.extend(
|
||||
[
|
||||
"Use a modern terminal emulator like GNOME Terminal, Konsole, or Alacritty",
|
||||
"Ensure UTF-8 locale is set: export LANG=en_US.UTF-8",
|
||||
"Install system packages: sudo apt-get install python3-pip python3-venv",
|
||||
]
|
||||
)
|
||||
elif system == "Windows":
|
||||
recommendations.extend(
|
||||
[
|
||||
"Use Windows Terminal for best TUI experience",
|
||||
"Enable UTF-8 support in Windows Terminal settings",
|
||||
"Consider using WSL2 for better Unix compatibility",
|
||||
"Install Python from python.org or Microsoft Store",
|
||||
]
|
||||
)
|
||||
|
||||
return recommendations
|
||||
|
||||
|
||||
def validate_environment() -> Dict[str, Any]:
|
||||
"""Validate the current environment for compatibility."""
|
||||
platform_info = get_platform_info()
|
||||
dependencies = check_dependencies()
|
||||
terminal_compat = check_terminal_compatibility()
|
||||
|
||||
return {
|
||||
"platform_supported": is_supported_platform(),
|
||||
"platform_info": platform_info,
|
||||
"dependencies": dependencies,
|
||||
"terminal_compatibility": terminal_compat,
|
||||
"recommendations": get_platform_recommendations(),
|
||||
"config_dir": str(get_default_config_dir()),
|
||||
"data_dir": str(get_default_data_dir()),
|
||||
"log_dir": str(get_default_log_dir()),
|
||||
}
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
# Print environment validation
|
||||
env_info = validate_environment()
|
||||
|
||||
print("Platform Compatibility Check")
|
||||
print("=" * 40)
|
||||
print(
|
||||
f"Platform: {env_info['platform_info']['system']} {env_info['platform_info']['release']}"
|
||||
)
|
||||
print(
|
||||
f"Python: {env_info['platform_info']['python_version']} ({env_info['platform_info']['python_implementation']})"
|
||||
)
|
||||
print(f"Supported: {'✓' if env_info['platform_supported'] else '✗'}")
|
||||
print()
|
||||
|
||||
print("Dependencies:")
|
||||
for dep, available in env_info["dependencies"].items():
|
||||
print(f" {dep}: {'✓' if available else '✗'}")
|
||||
print()
|
||||
|
||||
print("Terminal Compatibility:")
|
||||
for feature, supported in env_info["terminal_compatibility"].items():
|
||||
print(f" {feature}: {'✓' if supported else '✗'}")
|
||||
print()
|
||||
|
||||
print("Directories:")
|
||||
print(f" Config: {env_info['config_dir']}")
|
||||
print(f" Data: {env_info['data_dir']}")
|
||||
print(f" Logs: {env_info['log_dir']}")
|
||||
print()
|
||||
|
||||
if env_info["recommendations"]:
|
||||
print("Recommendations:")
|
||||
for rec in env_info["recommendations"]:
|
||||
print(f" • {rec}")
|
||||
284
src/utils/ticktick_utils.py
Normal file
@@ -0,0 +1,284 @@
"""
|
||||
TickTick utilities for formatting and helper functions.
|
||||
"""
|
||||
|
||||
import platform
|
||||
import subprocess
|
||||
import webbrowser
|
||||
from datetime import datetime, date, timedelta
|
||||
from typing import Dict, Any, List, Optional
|
||||
from rich.console import Console
|
||||
from rich.table import Table
|
||||
from rich.text import Text
|
||||
from dateutil import parser as date_parser
|
||||
|
||||
console = Console()
|
||||
|
||||
|
||||
def format_date(date_str: Optional[str]) -> str:
|
||||
"""
|
||||
Format a date string for display.
|
||||
|
||||
Args:
|
||||
date_str: ISO date string
|
||||
|
||||
Returns:
|
||||
Formatted date string
|
||||
"""
|
||||
if not date_str:
|
||||
return ""
|
||||
|
||||
try:
|
||||
dt = date_parser.parse(date_str)
|
||||
today = datetime.now().date()
|
||||
task_date = dt.date()
|
||||
|
||||
if task_date == today:
|
||||
return "Today"
|
||||
elif task_date == today + timedelta(days=1):
|
||||
return "Tomorrow"
|
||||
elif task_date == today - timedelta(days=1):
|
||||
return "Yesterday"
|
||||
else:
|
||||
return dt.strftime("%Y-%m-%d")
|
||||
except (ValueError, TypeError):
|
||||
return str(date_str)
|
||||
|
||||
|
||||
def get_priority_display(priority: int) -> Text:
|
||||
"""
|
||||
Get a rich Text object for priority display.
|
||||
|
||||
Args:
|
||||
priority: Priority level (0-5)
|
||||
|
||||
Returns:
|
||||
Rich Text object with colored priority
|
||||
"""
|
||||
if priority == 0:
|
||||
return Text("", style="dim")
|
||||
elif priority == 1:
|
||||
return Text("!", style="blue")
|
||||
elif priority == 2:
|
||||
return Text("!!", style="yellow")
|
||||
elif priority >= 3:
|
||||
return Text("!!!", style="red bold")
|
||||
else:
|
||||
return Text("", style="dim")
|
||||
|
||||
|
||||
def format_task_title(task: Dict[str, Any], max_length: int = 50) -> str:
|
||||
"""
|
||||
Format task title with truncation if needed.
|
||||
|
||||
Args:
|
||||
task: Task dictionary
|
||||
max_length: Maximum length for title
|
||||
|
||||
Returns:
|
||||
Formatted title string
|
||||
"""
|
||||
title = task.get("title", "Untitled")
|
||||
if len(title) > max_length:
|
||||
return title[: max_length - 3] + "..."
|
||||
return title
|
||||
|
||||
|
||||
def create_task_table(tasks: List[Dict[str, Any]], show_project: bool = True) -> Table:
|
||||
"""
|
||||
Create a rich table for displaying tasks.
|
||||
|
||||
Args:
|
||||
tasks: List of task dictionaries
|
||||
show_project: Whether to show project column
|
||||
|
||||
Returns:
|
||||
Rich Table object
|
||||
"""
|
||||
table = Table(show_header=True, header_style="bold magenta")
|
||||
|
||||
table.add_column("ID", style="dim", width=8)
|
||||
table.add_column("Priority", width=8)
|
||||
table.add_column("Title", style="white", min_width=30)
|
||||
if show_project:
|
||||
table.add_column("Project", style="cyan", width=15)
|
||||
table.add_column("Due Date", style="yellow", width=12)
|
||||
table.add_column("Tags", style="green", width=20)
|
||||
|
||||
for task in tasks:
|
||||
task_id = str(task.get("id", ""))[:8]
|
||||
priority_text = get_priority_display(task.get("priority", 0))
|
||||
title = format_task_title(task)
|
||||
due_date = format_date(task.get("dueDate"))
|
||||
|
||||
# Get project name (would need to be looked up from projects)
|
||||
project_name = task.get("projectId", "Inbox")[:15] if show_project else None
|
||||
|
||||
# Format tags
|
||||
tags_list = task.get("tags", [])
|
||||
if isinstance(tags_list, list):
|
||||
tags = ", ".join(tags_list[:3]) # Show max 3 tags
|
||||
if len(tags_list) > 3:
|
||||
tags += f" (+{len(tags_list) - 3})"
|
||||
else:
|
||||
tags = ""
|
||||
|
||||
if show_project:
|
||||
table.add_row(task_id, priority_text, title, project_name, due_date, tags)
|
||||
else:
|
||||
table.add_row(task_id, priority_text, title, due_date, tags)
|
||||
|
||||
return table
|
||||
|
||||
|
||||
def print_task_details(task: Dict[str, Any]):
|
||||
"""
|
||||
Print detailed view of a single task.
|
||||
|
||||
Args:
|
||||
task: Task dictionary
|
||||
"""
|
||||
console.print(f"[bold cyan]Task Details[/bold cyan]")
|
||||
console.print(f"ID: {task.get('id', 'N/A')}")
|
||||
console.print(f"Title: [white]{task.get('title', 'Untitled')}[/white]")
|
||||
|
||||
if task.get("content"):
|
||||
console.print(f"Description: {task.get('content')}")
|
||||
|
||||
console.print(f"Priority: {get_priority_display(task.get('priority', 0))}")
|
||||
console.print(f"Project ID: {task.get('projectId', 'N/A')}")
|
||||
|
||||
if task.get("dueDate"):
|
||||
console.print(f"Due Date: [yellow]{format_date(task.get('dueDate'))}[/yellow]")
|
||||
|
||||
if task.get("tags"):
|
||||
console.print(f"Tags: [green]{', '.join(task.get('tags', []))}[/green]")
|
||||
|
||||
console.print(f"Status: {'Completed' if task.get('status') == 2 else 'Open'}")
|
||||
|
||||
if task.get("createdTime"):
|
||||
console.print(f"Created: {format_date(task.get('createdTime'))}")
|
||||
|
||||
if task.get("modifiedTime"):
|
||||
console.print(f"Modified: {format_date(task.get('modifiedTime'))}")
|
||||
|
||||
|
||||
def open_task_in_browser(task_id: str):
|
||||
"""
|
||||
Open a task in the default web browser.
|
||||
|
||||
Args:
|
||||
task_id: Task ID to open
|
||||
"""
|
||||
url = f"https://ticktick.com/webapp/#q/all/tasks/{task_id}"
|
||||
webbrowser.open(url)
|
||||
console.print(f"[green]Opened task in browser: {url}[/green]")
|
||||
|
||||
|
||||
def open_task_in_macos_app(task_id: str) -> bool:
|
||||
"""
|
||||
Open a task in the TickTick macOS app.
|
||||
|
||||
Args:
|
||||
task_id: Task ID to open
|
||||
|
||||
Returns:
|
||||
True if successful, False otherwise
|
||||
"""
|
||||
if platform.system() != "Darwin":
|
||||
console.print("[red]macOS app opening is only available on macOS[/red]")
|
||||
return False
|
||||
|
||||
try:
|
||||
# Try to open with TickTick URL scheme
|
||||
ticktick_url = f"ticktick://task/{task_id}"
|
||||
result = subprocess.run(
|
||||
["open", ticktick_url], capture_output=True, text=True, timeout=5
|
||||
)
|
||||
|
||||
if result.returncode == 0:
|
||||
console.print(f"[green]Opened task in TickTick app[/green]")
|
||||
return True
|
||||
else:
|
||||
console.print(
|
||||
"[yellow]TickTick app not found, opening in browser instead[/yellow]"
|
||||
)
|
||||
open_task_in_browser(task_id)
|
||||
return False
|
||||
|
||||
except (subprocess.TimeoutExpired, subprocess.SubprocessError, FileNotFoundError):
|
||||
console.print(
|
||||
"[yellow]Failed to open TickTick app, opening in browser instead[/yellow]"
|
||||
)
|
||||
open_task_in_browser(task_id)
|
||||
return False
|
||||
|
||||
|
||||
def open_task(task_id: str, prefer_app: bool = True):
|
||||
"""
|
||||
Open a task in browser or app based on preference.
|
||||
|
||||
Args:
|
||||
task_id: Task ID to open
|
||||
prefer_app: Whether to prefer native app over browser
|
||||
"""
|
||||
if prefer_app and platform.system() == "Darwin":
|
||||
if not open_task_in_macos_app(task_id):
|
||||
open_task_in_browser(task_id)
|
||||
else:
|
||||
open_task_in_browser(task_id)
|
||||
|
||||
|
||||
def parse_priority(priority_str: str) -> int:
|
||||
"""
|
||||
Parse priority string to integer.
|
||||
|
||||
Args:
|
||||
priority_str: Priority as string (low, medium, high, none, or 0-5)
|
||||
|
||||
Returns:
|
||||
Priority integer (0-5)
|
||||
"""
|
||||
if not priority_str:
|
||||
return 0
|
||||
|
||||
priority_str = priority_str.lower().strip()
|
||||
|
||||
if priority_str in ["none", "no", "0"]:
|
||||
return 0
|
||||
elif priority_str in ["low", "1"]:
|
||||
return 1
|
||||
elif priority_str in ["medium", "med", "2"]:
|
||||
return 2
|
||||
elif priority_str in ["high", "3"]:
|
||||
return 3
|
||||
elif priority_str in ["very high", "urgent", "4"]:
|
||||
return 4
|
||||
elif priority_str in ["critical", "5"]:
|
||||
return 5
|
||||
else:
|
||||
try:
|
||||
priority = int(priority_str)
|
||||
return max(0, min(5, priority))
|
||||
except ValueError:
|
||||
return 0
|
||||
|
||||
|
||||
def validate_date(date_str: str) -> bool:
|
||||
"""
|
||||
Validate if a date string is parseable.
|
||||
|
||||
Args:
|
||||
date_str: Date string to validate
|
||||
|
||||
Returns:
|
||||
True if valid, False otherwise
|
||||
"""
|
||||
if date_str.lower() in ["today", "tomorrow", "yesterday"]:
|
||||
return True
|
||||
|
||||
try:
|
||||
date_parser.parse(date_str)
|
||||
return True
|
||||
except ValueError:
|
||||
return False
|
||||
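A minimal usage sketch for these formatting helpers; the task dict shape is inferred from the accessors above, and all values are invented:

```python
# Minimal usage sketch for the TickTick helpers above (hypothetical task data).
from rich.console import Console
from src.utils.ticktick_utils import create_task_table, parse_priority, format_date

task = {
    "id": "64f1a2b3c4d5",
    "title": "Draft quarterly report",
    "priority": parse_priority("high"),        # -> 3
    "dueDate": "2025-01-15T09:00:00+0000",
    "tags": ["work", "writing"],
    "projectId": "Work",
}

print(format_date(task["dueDate"]))            # "2025-01-15", or "Today"/"Tomorrow"
Console().print(create_task_table([task], show_project=True))
```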
381
sweep_tasks.py
Executable file
@@ -0,0 +1,381 @@
#!/usr/bin/env python3
"""
Godspeed Task Sweeper - Consolidate incomplete tasks from markdown files.

This script recursively searches through directories (like 2024/, 2025/, etc.)
and moves all incomplete tasks from markdown files into the Godspeed Inbox.md file.
"""

import argparse
import re
import shutil
from pathlib import Path
from typing import List, Tuple, Set
from datetime import datetime


class TaskSweeper:
    """Sweeps incomplete tasks from markdown files into Godspeed Inbox."""

    def __init__(self, notes_dir: Path, godspeed_dir: Path, dry_run: bool = False):
        self.notes_dir = Path(notes_dir)
        self.godspeed_dir = Path(godspeed_dir)
        self.dry_run = dry_run
        self.inbox_file = self.godspeed_dir / "Inbox.md"

        # Import the sync engine for consistent parsing
        try:
            from src.services.godspeed.sync import GodspeedSync

            self.sync_engine = GodspeedSync(None, godspeed_dir)
        except ImportError:
            # Fallback parsing if import fails
            self.sync_engine = None

    def _parse_task_line_fallback(self, line: str) -> Tuple[str, str, str, str]:
        """Fallback task parsing if sync engine not available."""
        # Match patterns like: - [ ] Task title <!-- id:abc123 -->
        task_pattern = (
            r"^\s*-\s*\[([xX\s\-])\]\s*(.+?)(?:\s*<!--\s*id:(\w+)\s*-->)?\s*$"
        )
        match = re.match(task_pattern, line.strip())

        if not match:
            return None

        checkbox, title_and_notes, local_id = match.groups()

        # Determine status
        if checkbox.lower() == "x":
            status = "complete"
        elif checkbox == "-":
            status = "cleared"
        else:
            status = "incomplete"

        # Extract title (remove any inline notes after <!--)
        title = title_and_notes.split("<!--")[0].strip()

        # Generate ID if missing
        if not local_id:
            import uuid

            local_id = str(uuid.uuid4())[:8]

        return local_id, status, title, ""

    def _parse_markdown_file(self, file_path: Path) -> Tuple[List[Tuple], List[str]]:
        """Parse a markdown file and extract tasks and non-task content."""
        if not file_path.exists():
            return [], []

        tasks = []
        non_task_lines = []

        try:
            with open(file_path, "r", encoding="utf-8") as f:
                lines = f.readlines()
        except Exception as e:
            print(f"  ⚠️ Error reading {file_path}: {e}")
            return [], []

        i = 0
        while i < len(lines):
            line = lines[i].rstrip()

            # Check if this line looks like a task
            if line.strip().startswith("- ["):
                # Try to parse with sync engine first
                if self.sync_engine:
                    # Collect potential multi-line task
                    task_block = [line]
                    j = i + 1
                    while (
                        j < len(lines)
                        and lines[j].strip()
                        and not lines[j].strip().startswith("- [")
                    ):
                        task_block.append(lines[j].rstrip())
                        j += 1

                    task_text = "\n".join(task_block)
                    parsed = self.sync_engine._parse_task_line(task_text)

                    if parsed:
                        tasks.append(parsed)
                        i = j  # Skip the lines we've processed
                        continue

                # Fallback parsing
                parsed = self._parse_task_line_fallback(line)
                if parsed:
                    tasks.append(parsed)
                    i += 1
                    continue

            # Not a task, keep as regular content
            non_task_lines.append(line)
            i += 1

        return tasks, non_task_lines

    def _write_tasks_to_file(self, file_path: Path, tasks: List[Tuple]):
        """Write tasks to a markdown file."""
        if not tasks:
            return

        file_path.parent.mkdir(parents=True, exist_ok=True)

        # Read existing content if file exists
        existing_content = ""
        if file_path.exists():
            with open(file_path, "r", encoding="utf-8") as f:
                existing_content = f.read()

        # Format new tasks
        new_task_lines = []
        for local_id, status, title, notes in tasks:
            if self.sync_engine:
                formatted = self.sync_engine._format_task_line(
                    local_id, status, title, notes
                )
            else:
                # Fallback formatting
                checkbox = {"incomplete": "[ ]", "complete": "[x]", "cleared": "[-]"}[
                    status
                ]
                formatted = f"- {checkbox} {title} <!-- id:{local_id} -->"
                if notes:
                    formatted += f"\n  {notes}"

            new_task_lines.append(formatted)

        # Combine with existing content
        if existing_content.strip():
            new_content = (
                existing_content.rstrip() + "\n\n" + "\n".join(new_task_lines) + "\n"
            )
        else:
            new_content = "\n".join(new_task_lines) + "\n"

        with open(file_path, "w", encoding="utf-8") as f:
            f.write(new_content)

    def _clean_file(self, file_path: Path, non_task_lines: List[str]):
        """Remove tasks from original file, keeping only non-task content."""
        if not non_task_lines or all(not line.strip() for line in non_task_lines):
            # File would be empty, delete it
            if not self.dry_run:
                file_path.unlink()
            print(f"  🗑️ Would delete empty file: {file_path}")
        else:
            # Write back non-task content
            cleaned_content = "\n".join(non_task_lines).strip()
            if cleaned_content:
                cleaned_content += "\n"

            if not self.dry_run:
                with open(file_path, "w", encoding="utf-8") as f:
                    f.write(cleaned_content)
            print(f"  ✂️ Cleaned file (removed tasks): {file_path}")

    def find_markdown_files(self) -> List[Path]:
        """Find all markdown files in the notes directory, excluding Godspeed directory."""
        markdown_files = []

        for md_file in self.notes_dir.rglob("*.md"):
            # Skip files in the Godspeed directory
            if (
                self.godspeed_dir in md_file.parents
                or md_file.parent == self.godspeed_dir
            ):
                continue

            # Skip hidden files and directories
            if any(part.startswith(".") for part in md_file.parts):
                continue

            markdown_files.append(md_file)

        return sorted(markdown_files)

    def sweep_tasks(self) -> dict:
        """Sweep incomplete tasks from all markdown files into Inbox."""
        print(f"🧹 Sweeping incomplete tasks from: {self.notes_dir}")
        print(f"📥 Target Inbox: {self.inbox_file}")
        print(f"🔍 Dry run: {self.dry_run}")
        print("=" * 60)

        markdown_files = self.find_markdown_files()
        print(f"\n📁 Found {len(markdown_files)} markdown files to process")

        swept_tasks = []
        processed_files = []
        empty_files_deleted = []

        for file_path in markdown_files:
            rel_path = file_path.relative_to(self.notes_dir)
            print(f"\n📄 Processing: {rel_path}")

            tasks, non_task_lines = self._parse_markdown_file(file_path)
            if not tasks:
                print(f"  ℹ️ No tasks found")
                continue

            # Separate incomplete tasks from completed/cleared ones
            incomplete_tasks = []
            complete_tasks = []

            for task in tasks:
                local_id, status, title, notes = task
                if status == "incomplete":
                    incomplete_tasks.append(task)
                else:
                    complete_tasks.append(task)

            if incomplete_tasks:
                print(f"  🔄 Found {len(incomplete_tasks)} incomplete tasks:")
                for _, status, title, notes in incomplete_tasks:
                    print(f"    • {title}")
                    if notes:
                        print(f"      Notes: {notes}")

                # Add source file annotation
                source_annotation = f"<!-- Swept from {rel_path} on {datetime.now().strftime('%Y-%m-%d %H:%M')} -->"
                annotated_tasks = []
                for local_id, status, title, notes in incomplete_tasks:
                    # Add source info to notes
                    source_notes = f"From: {rel_path}"
                    if notes:
                        combined_notes = f"{notes}\n{source_notes}"
                    else:
                        combined_notes = source_notes
                    annotated_tasks.append((local_id, status, title, combined_notes))

                swept_tasks.extend(annotated_tasks)
                processed_files.append(str(rel_path))

            if complete_tasks:
                print(
                    f"  ✅ Keeping {len(complete_tasks)} completed/cleared tasks in place"
                )

            # Reconstruct remaining content (non-tasks + completed tasks)
            remaining_content = non_task_lines.copy()

            # Add completed/cleared tasks back to remaining content
            if complete_tasks:
                remaining_content.append("")  # Empty line before tasks
                for task in complete_tasks:
                    if self.sync_engine:
                        formatted = self.sync_engine._format_task_line(*task)
                    else:
                        local_id, status, title, notes = task
                        checkbox = {
                            "incomplete": "[ ]",
                            "complete": "[x]",
                            "cleared": "[-]",
                        }[status]
                        formatted = f"- {checkbox} {title} <!-- id:{local_id} -->"
                        if notes:
                            formatted += f"\n  {notes}"
                    remaining_content.append(formatted)

            # Clean the original file
            if incomplete_tasks:
                self._clean_file(file_path, remaining_content)

        # Write swept tasks to Inbox
        if swept_tasks:
            print(f"\n📥 Writing {len(swept_tasks)} tasks to Inbox...")
            if not self.dry_run:
                self._write_tasks_to_file(self.inbox_file, swept_tasks)
                print(f"  ✅ Inbox updated: {self.inbox_file}")

        # Summary
        print(f"\n" + "=" * 60)
        print(f"📊 SWEEP SUMMARY:")
        print(f"  • Files processed: {len(processed_files)}")
        print(f"  • Tasks swept: {len(swept_tasks)}")
        print(f"  • Target: {self.inbox_file}")

        if self.dry_run:
            print(f"\n⚠️ DRY RUN - No files were actually modified")
            print(f"  Run without --dry-run to perform the sweep")

        return {
            "swept_tasks": len(swept_tasks),
            "processed_files": processed_files,
            "inbox_file": str(self.inbox_file),
        }


def main():
    parser = argparse.ArgumentParser(
        description="Sweep incomplete tasks from markdown files into Godspeed Inbox",
        epilog="""
Examples:
  python sweep_tasks.py ~/Documents/Notes ~/Documents/Godspeed
  python sweep_tasks.py . ./godspeed --dry-run
  python sweep_tasks.py ~/Notes ~/Notes/godspeed --dry-run
        """,
        formatter_class=argparse.RawDescriptionHelpFormatter,
    )

    parser.add_argument(
        "notes_dir",
        type=Path,
        help="Root directory containing markdown files with tasks (e.g., ~/Documents/Notes)",
    )

    parser.add_argument(
        "godspeed_dir",
        type=Path,
        help="Godspeed sync directory where Inbox.md will be created (e.g., ~/Documents/Godspeed)",
    )

    parser.add_argument(
        "--dry-run",
        action="store_true",
        help="Show what would be done without making changes",
    )

    args = parser.parse_args()

    # Validate directories
    if not args.notes_dir.exists():
        print(f"❌ Notes directory does not exist: {args.notes_dir}")
        return 1

    if not args.notes_dir.is_dir():
        print(f"❌ Notes path is not a directory: {args.notes_dir}")
        return 1

    # Godspeed directory will be created if it doesn't exist

    try:
        sweeper = TaskSweeper(args.notes_dir, args.godspeed_dir, args.dry_run)
        result = sweeper.sweep_tasks()

        if result["swept_tasks"] > 0:
            print(f"\n🎉 Successfully swept {result['swept_tasks']} tasks!")
            if not args.dry_run:
                print(f"💡 Next steps:")
                print(f"  1. Review tasks in: {result['inbox_file']}")
                print(f"  2. Run 'godspeed upload' to sync to API")
                print(f"  3. Organize tasks into appropriate lists in Godspeed app")
        else:
            print(f"\n✨ No incomplete tasks found to sweep.")

        return 0

    except Exception as e:
        print(f"❌ Error during sweep: {e}")
        import traceback

        traceback.print_exc()
        return 1


if __name__ == "__main__":
    exit(main())

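The fallback regex is the heart of the sweeper's parsing; a standalone sketch of the same pattern on invented sample lines:

```python
# Standalone sketch of the checkbox pattern used by _parse_task_line_fallback;
# the sample lines below are invented for illustration.
import re

TASK_RE = re.compile(r"^\s*-\s*\[([xX\s\-])\]\s*(.+?)(?:\s*<!--\s*id:(\w+)\s*-->)?\s*$")

for line in [
    "- [ ] Call the dentist <!-- id:a1b2c3 -->",
    "- [x] Pay rent",
    "- [-] Old idea, dropped <!-- id:z9y8x7 -->",
]:
    checkbox, title, local_id = TASK_RE.match(line).groups()
    status = {"x": "complete", "-": "cleared"}.get(checkbox.lower(), "incomplete")
    print(status, "|", title.split("<!--")[0].strip(), "|", local_id or "(no id)")
```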
60
test_aerc_integration.sh
Executable file
@@ -0,0 +1,60 @@
#!/bin/bash

# Test script to demonstrate aerc integration with the sendmail wrapper
# This shows how aerc would interact with our email sending system

echo "=== Testing Email Sending Daemon ==="
echo

# Get the full path to our sendmail wrapper
SENDMAIL_PATH="$(pwd)/sendmail"
echo "Sendmail wrapper: $SENDMAIL_PATH"
echo

# Show current queue status
echo "Current outbox queue:"
find ~/Mail/*/outbox/new -type f 2>/dev/null | wc -l | xargs echo "Pending emails:"
echo

# Create a test email that aerc might send
echo "Creating test email as aerc would..."
cat << 'EOF' | $SENDMAIL_PATH
From: user@corteva.com
To: colleague@example.com
Cc: team@example.com
Subject: Project Update from aerc

Hi team,

This email was composed in aerc using helix editor and queued
for sending through our Microsoft Graph adapter.

The email will be sent when the sync daemon processes the outbox.

Best regards,
User
EOF

echo "Email queued successfully!"
echo

# Show updated queue status
echo "Updated outbox queue:"
find ~/Mail/*/outbox/new -type f 2>/dev/null | wc -l | xargs echo "Pending emails:"
echo

echo "To process the queue, run:"
echo "  python3 -m src.cli sync --daemon --notify"
echo
echo "Or for a one-time sync:"
echo "  python3 -m src.cli sync --dry-run"
echo

echo "=== aerc Configuration ==="
echo "Add this to your aerc config:"
echo
echo "[outgoing]"
echo "sendmail = $SENDMAIL_PATH"
echo
echo "Then compose emails in aerc as usual - they'll be queued offline"
echo "and sent when you run the sync daemon."

95
test_cancelled_tasks.py
Normal file
@@ -0,0 +1,95 @@
#!/usr/bin/env python3
|
||||
"""
|
||||
Test script for Godspeed cancelled task functionality.
|
||||
"""
|
||||
|
||||
import tempfile
|
||||
from pathlib import Path
|
||||
|
||||
|
||||
def test_cancelled_task_parsing():
|
||||
print("=== Testing Cancelled Task Support ===\n")
|
||||
|
||||
from src.services.godspeed.sync import GodspeedSync
|
||||
|
||||
with tempfile.TemporaryDirectory() as temp_dir:
|
||||
sync_dir = Path(temp_dir)
|
||||
sync_engine = GodspeedSync(None, sync_dir)
|
||||
|
||||
print("1. Testing task status parsing:")
|
||||
print("-" * 40)
|
||||
|
||||
test_lines = [
|
||||
"- [ ] Incomplete task <!-- id:abc123 -->",
|
||||
"- [x] Completed task <!-- id:def456 -->",
|
||||
"- [X] Also completed <!-- id:ghi789 -->",
|
||||
"- [-] Cancelled task <!-- id:jkl012 -->",
|
||||
]
|
||||
|
||||
for line in test_lines:
|
||||
parsed = sync_engine._parse_task_line(line)
|
||||
if parsed:
|
||||
local_id, status, title, notes = parsed
|
||||
icon = {"incomplete": "⏳", "complete": "✅", "cleared": "❌"}[status]
|
||||
print(f" {icon} {status.upper()}: {title} (ID: {local_id})")
|
||||
else:
|
||||
print(f" ❌ Failed to parse: {line}")
|
||||
|
||||
print("\n2. Testing task formatting:")
|
||||
print("-" * 30)
|
||||
|
||||
tasks = [
|
||||
("task1", "incomplete", "Buy groceries", ""),
|
||||
("task2", "complete", "Call dentist", ""),
|
||||
("task3", "cleared", "Old project", "No longer needed"),
|
||||
]
|
||||
|
||||
for local_id, status, title, notes in tasks:
|
||||
formatted = sync_engine._format_task_line(local_id, status, title, notes)
|
||||
print(f" {formatted}")
|
||||
|
||||
print("\n3. Testing roundtrip with all statuses:")
|
||||
print("-" * 42)
|
||||
|
||||
# Write to file
|
||||
test_file = sync_dir / "test_statuses.md"
|
||||
sync_engine._write_list_file(test_file, tasks)
|
||||
|
||||
# Read back
|
||||
read_tasks = sync_engine._read_list_file(test_file)
|
||||
|
||||
print(f" Original: {len(tasks)} tasks")
|
||||
print(f" Read back: {len(read_tasks)} tasks")
|
||||
|
||||
for original, read_back in zip(tasks, read_tasks):
|
||||
orig_id, orig_status, orig_title, orig_notes = original
|
||||
read_id, read_status, read_title, read_notes = read_back
|
||||
|
||||
if orig_status == read_status and orig_title == read_title:
|
||||
icon = {"incomplete": "⏳", "complete": "✅", "cleared": "❌"}[
|
||||
orig_status
|
||||
]
|
||||
print(f" {icon} {orig_status.upper()}: {orig_title} - ✓ Match")
|
||||
else:
|
||||
print(f" ❌ Mismatch:")
|
||||
print(f" Original: {orig_status}, '{orig_title}'")
|
||||
print(f" Read: {read_status}, '{read_title}'")
|
||||
|
||||
print("\n4. File content generated:")
|
||||
print("-" * 25)
|
||||
with open(test_file, "r") as f:
|
||||
content = f.read()
|
||||
print(content)
|
||||
|
||||
print("5. API update simulation:")
|
||||
print("-" * 27)
|
||||
print("For cancelled task ([-]), would send:")
|
||||
print(" PATCH /tasks/xyz {'is_complete': True, 'is_cleared': True}")
|
||||
print("\nFor completed task ([x]), would send:")
|
||||
print(" PATCH /tasks/abc {'is_complete': True, 'is_cleared': False}")
|
||||
print("\nFor incomplete task ([ ]), would send:")
|
||||
print(" PATCH /tasks/def {'is_complete': False, 'is_cleared': False}")
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
test_cancelled_task_parsing()
|
||||
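The simulated PATCH payloads printed in step 5 pin down the intended mapping from checkbox markers to API fields. A sketch of that mapping, derived only from the printed output above:

```python
# Checkbox marker -> Godspeed task fields, as exercised by the test above.
# The endpoint shape (PATCH /tasks/<id>) is taken from the simulation output,
# not from the real client code.
def payload_for(status: str) -> dict:
    return {
        "incomplete": {"is_complete": False, "is_cleared": False},  # - [ ]
        "complete": {"is_complete": True, "is_cleared": False},  # - [x]
        "cleared": {"is_complete": True, "is_cleared": True},  # - [-] cancelled
    }[status]


assert payload_for("cleared") == {"is_complete": True, "is_cleared": True}
```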
123
test_completion_status.py
Normal file
@@ -0,0 +1,123 @@
#!/usr/bin/env python3
"""
Quick test to verify completion status handling in Godspeed sync.
"""

import tempfile
from pathlib import Path


# Test the markdown parsing for completion status
def test_completion_parsing():
    print("Testing completion status parsing...")

    from src.services.godspeed.sync import GodspeedSync

    with tempfile.TemporaryDirectory() as temp_dir:
        sync_dir = Path(temp_dir)
        sync_engine = GodspeedSync(None, sync_dir)

        # Test different completion states
        test_lines = [
            "- [ ] Incomplete task <!-- id:abc123 -->",
            "- [x] Completed task <!-- id:def456 -->",
            "- [X] Also completed <!-- id:ghi789 -->",  # Capital X
            "- [ ] Another incomplete <!-- id:jkl012 -->",
        ]

        for line in test_lines:
            parsed = sync_engine._parse_task_line(line)
            if parsed:
                local_id, status, title, notes = parsed
                display = "✓ Complete" if status == "complete" else "○ Incomplete"
                print(f"  {display}: {title} (ID: {local_id})")
            else:
                print(f"  Failed to parse: {line}")


def test_format_task():
    print("\nTesting task formatting...")

    from src.services.godspeed.sync import GodspeedSync

    with tempfile.TemporaryDirectory() as temp_dir:
        sync_dir = Path(temp_dir)
        sync_engine = GodspeedSync(None, sync_dir)

        # Test formatting both completion states
        incomplete_line = sync_engine._format_task_line(
            "abc123", "incomplete", "Buy milk", ""
        )
        completed_line = sync_engine._format_task_line(
            "def456", "complete", "Call mom", ""
        )
        with_notes_line = sync_engine._format_task_line(
            "ghi789", "incomplete", "Project work", "Due Friday"
        )

        print(f"  Incomplete: {incomplete_line}")
        print(f"  Completed: {completed_line}")
        print(f"  With notes: {with_notes_line}")


def test_roundtrip():
    print("\nTesting roundtrip parsing...")

    from src.services.godspeed.sync import GodspeedSync

    with tempfile.TemporaryDirectory() as temp_dir:
        sync_dir = Path(temp_dir)
        sync_engine = GodspeedSync(None, sync_dir)

        # Original tasks with different completion states
        original_tasks = [
            ("task1", "incomplete", "Buy groceries", "From whole foods"),
            ("task2", "complete", "Call dentist", ""),
            ("task3", "incomplete", "Finish report", "Due Monday"),
            ("task4", "complete", "Exercise", "Went for a run"),
        ]

        # Write to file
        test_file = sync_dir / "test_roundtrip.md"
        sync_engine._write_list_file(test_file, original_tasks)

        # Read back
        read_tasks = sync_engine._read_list_file(test_file)

        print(f"  Original: {len(original_tasks)} tasks")
        print(f"  Read back: {len(read_tasks)} tasks")

        for i, (original, read_back) in enumerate(zip(original_tasks, read_tasks)):
            orig_id, orig_status, orig_title, orig_notes = original
            read_id, read_status, read_title, read_notes = read_back

            if orig_status == read_status and orig_title == read_title:
                display = "✓ Complete" if orig_status == "complete" else "○ Incomplete"
                print(f"  {display}: {orig_title} - ✓ Match")
            else:
                print(f"  ✗ Mismatch on task {i + 1}:")
                print(f"    Original: status={orig_status}, title='{orig_title}'")
                print(f"    Read: status={read_status}, title='{read_title}'")


if __name__ == "__main__":
    print("=== Godspeed Completion Status Test ===\n")

    try:
        test_completion_parsing()
        test_format_task()
        test_roundtrip()

        print("\n=== Test Summary ===")
        print("✓ Completion status handling is working correctly!")
        print("\nExpected behavior:")
        print("- [ ] tasks sync as incomplete (is_complete=False)")
        print("- [x] tasks sync as completed (is_complete=True)")
        print("- Status changes in markdown files will sync to Godspeed")
        print("- Status changes in Godspeed will sync to markdown files")

    except Exception as e:
        print(f"\n✗ Test failed: {e}")
        import traceback

        traceback.print_exc()
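Both test scripts rely on `_parse_task_line` accepting the four checkbox markers. A hypothetical re-implementation of that line grammar (the real one lives in `src/services/godspeed/sync.py`, so treat this only as a sketch of the contract the tests assume):

```python
import re

# Hypothetical sketch of the task-line grammar these tests assume.
TASK_RE = re.compile(r"^- \[([ xX-])\] (.+?)(?: <!-- id:(\w+) -->)?\s*$")
STATUS = {" ": "incomplete", "x": "complete", "X": "complete", "-": "cleared"}


def parse(line: str):
    match = TASK_RE.match(line)
    if not match:
        return None
    box, title, local_id = match.groups()
    return local_id, STATUS[box], title


print(parse("- [x] Completed task <!-- id:def456 -->"))
# -> ('def456', 'complete', 'Completed task')
```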
270
test_godspeed_sync.py
Normal file
@@ -0,0 +1,270 @@
#!/usr/bin/env python3
"""
Test script for Godspeed sync functionality.

This script demonstrates the Godspeed sync tool by creating sample data
and testing various sync scenarios.
"""

import os
import tempfile
from pathlib import Path
import json

# Mock data for testing without real API calls
MOCK_LISTS = [
    {"id": "list-1", "name": "Personal"},
    {"id": "list-2", "name": "Work Projects"},
    {"id": "list-3", "name": "Shopping"},
]

MOCK_TASKS = [
    {
        "id": "task-1",
        "title": "Buy groceries",
        "list_id": "list-3",
        "is_complete": False,
        "notes": "Don't forget milk and eggs",
    },
    {
        "id": "task-2",
        "title": "Finish quarterly report",
        "list_id": "list-2",
        "is_complete": False,
        "notes": "Due Friday",
    },
    {
        "id": "task-3",
        "title": "Call dentist",
        "list_id": "list-1",
        "is_complete": True,
        "notes": "",
    },
    {
        "id": "task-4",
        "title": "Review pull requests",
        "list_id": "list-2",
        "is_complete": False,
        "notes": "Check PR #123 and #124",
    },
]


class MockGodspeedClient:
    """Mock client for testing without hitting real API."""

    def __init__(self, **kwargs):
        pass

    def get_lists(self):
        return MOCK_LISTS

    def get_tasks(self, **kwargs):
        filtered_tasks = MOCK_TASKS
        if kwargs.get("list_id"):
            filtered_tasks = [
                t for t in MOCK_TASKS if t["list_id"] == kwargs["list_id"]
            ]
        if kwargs.get("status"):
            if kwargs["status"] == "complete":
                filtered_tasks = [t for t in filtered_tasks if t["is_complete"]]
            elif kwargs["status"] == "incomplete":
                filtered_tasks = [t for t in filtered_tasks if not t["is_complete"]]

        # Mock the API response format
        lists_dict = {lst["id"]: lst for lst in MOCK_LISTS}
        return {"tasks": filtered_tasks, "lists": lists_dict}

    def create_task(self, **kwargs):
        new_task = {
            "id": f"task-{len(MOCK_TASKS) + 1}",
            "title": kwargs["title"],
            "list_id": kwargs.get("list_id"),
            "is_complete": False,
            "notes": kwargs.get("notes", ""),
        }
        MOCK_TASKS.append(new_task)
        return new_task

    def update_task(self, task_id, **kwargs):
        for task in MOCK_TASKS:
            if task["id"] == task_id:
                task.update(kwargs)
                return task
        raise Exception(f"Task {task_id} not found")

    def complete_task(self, task_id):
        return self.update_task(task_id, is_complete=True)


def test_markdown_parsing():
    """Test markdown task parsing functionality."""
    print("Testing markdown parsing...")

    from src.services.godspeed.sync import GodspeedSync

    # Create temporary directory
    with tempfile.TemporaryDirectory() as temp_dir:
        sync_dir = Path(temp_dir)
        sync_engine = GodspeedSync(None, sync_dir)

        # Test task line parsing
        test_lines = [
            "- [ ] Simple task <!-- id:abc123 -->",
            "- [x] Completed task <!-- id:def456 -->",
            "- [ ] Task with notes <!-- id:ghi789 -->\n  Some additional notes here",
            "- [ ] New task without ID",
        ]

        for line in test_lines:
            parsed = sync_engine._parse_task_line(line)
            if parsed:
                local_id, is_complete, title, notes = parsed
                print(f"  Parsed: {title} (ID: {local_id}, Complete: {is_complete})")
                if notes:
                    print(f"    Notes: {notes}")
            else:
                print(f"  Failed to parse: {line}")


def test_file_operations():
    """Test file reading and writing operations."""
    print("\nTesting file operations...")

    from src.services.godspeed.sync import GodspeedSync

    with tempfile.TemporaryDirectory() as temp_dir:
        sync_dir = Path(temp_dir)
        sync_engine = GodspeedSync(None, sync_dir)

        # Create test tasks
        test_tasks = [
            ("abc123", False, "Buy milk", "From the grocery store"),
            ("def456", True, "Call mom", ""),
            ("ghi789", False, "Finish project", "Due next week"),
        ]

        # Write tasks to file
        test_file = sync_dir / "test_list.md"
        sync_engine._write_list_file(test_file, test_tasks)
        print(f"  Created test file: {test_file}")

        # Read tasks back
        read_tasks = sync_engine._read_list_file(test_file)
        print(f"  Read {len(read_tasks)} tasks back from file")

        for i, (original, read_back) in enumerate(zip(test_tasks, read_tasks)):
            if original == read_back:
                print(f"  Task {i + 1}: ✓ Match")
            else:
                print(f"  Task {i + 1}: ✗ Mismatch")
                print(f"    Original: {original}")
                print(f"    Read back: {read_back}")


def test_mock_sync():
    """Test sync operations with mock data."""
    print("\nTesting sync with mock data...")

    # Temporarily replace the real client
    import src.services.godspeed.sync as sync_module

    original_client_class = sync_module.GodspeedClient
    sync_module.GodspeedClient = MockGodspeedClient

    try:
        from src.services.godspeed.sync import GodspeedSync

        with tempfile.TemporaryDirectory() as temp_dir:
            sync_dir = Path(temp_dir)

            # Create mock client and sync engine
            mock_client = MockGodspeedClient()
            sync_engine = GodspeedSync(mock_client, sync_dir)

            # Test download
            print("  Testing download...")
            sync_engine.download_from_api()

            # Check created files
            md_files = list(sync_dir.glob("*.md"))
            print(f"  Created {len(md_files)} markdown files")

            for md_file in md_files:
                tasks = sync_engine._read_list_file(md_file)
                print(f"    {md_file.name}: {len(tasks)} tasks")

            # Test status
            status = sync_engine.get_sync_status()
            print(
                f"  Status: {status['local_files']} files, {status['total_local_tasks']} tasks"
            )

            # Test upload (modify a file first)
            if md_files:
                first_file = md_files[0]
                with open(first_file, "a") as f:
                    f.write("- [ ] New local task <!-- id:newlocal -->\n")

                print("  Testing upload...")
                sync_engine.upload_to_api()
                print(
                    f"  Upload completed, now {len(MOCK_TASKS)} total tasks in mock data"
                )

    finally:
        # Restore original client
        sync_module.GodspeedClient = original_client_class


def test_cli_integration():
    """Test CLI commands (without real API calls)."""
    print("\nTesting CLI integration...")

    # Test that imports work
    try:
        from src.cli.godspeed import godspeed, get_sync_directory

        print("  ✓ CLI imports successful")

        # Test sync directory detection
        sync_dir = get_sync_directory()
        print(f"  ✓ Sync directory: {sync_dir}")

    except ImportError as e:
        print(f"  ✗ CLI import failed: {e}")


def main():
    """Run all tests."""
    print("=== Godspeed Sync Test Suite ===\n")

    try:
        test_markdown_parsing()
        test_file_operations()
        test_mock_sync()
        test_cli_integration()

        print("\n=== Test Summary ===")
        print("✓ All tests completed successfully!")
        print("\nTo use the real Godspeed sync:")
        print("1. Set environment variables:")
        print("   export GODSPEED_EMAIL='your@email.com'")
        print("   export GODSPEED_PASSWORD='your-password'")
        print("   # OR")
        print("   export GODSPEED_TOKEN='your-api-token'")
        print("")
        print("2. Run sync commands:")
        print("   python -m src.cli.godspeed download")
        print("   python -m src.cli.godspeed status")
        print("   python -m src.cli.godspeed sync")

    except Exception as e:
        print(f"\n✗ Test failed: {e}")
        import traceback

        traceback.print_exc()


if __name__ == "__main__":
    main()
218
test_task_sweeper.py
Normal file
@@ -0,0 +1,218 @@
#!/usr/bin/env python3
"""
Test and demo script for the task sweeper functionality.
"""

import tempfile
from pathlib import Path
import os
import sys


def create_test_structure():
    """Create a test directory structure with scattered tasks."""
    # Add the project root to path so we can import
    project_root = Path(__file__).parent
    sys.path.insert(0, str(project_root))

    with tempfile.TemporaryDirectory() as temp_dir:
        base_dir = Path(temp_dir)

        print("🏗️ Creating test directory structure...")

        # Create year directories with markdown files
        (base_dir / "2024" / "projects").mkdir(parents=True)
        (base_dir / "2024" / "notes").mkdir(parents=True)
        (base_dir / "2025" / "planning").mkdir(parents=True)
        (base_dir / "archive").mkdir(parents=True)
        (base_dir / "godspeed").mkdir(parents=True)

        # Create test files with various tasks
        test_files = {
            "2024/projects/website.md": """# Website Redesign Project

## Overview
This project aims to redesign our company website.

## Tasks
- [x] Create wireframes <!-- id:wire123 -->
- [ ] Design mockups <!-- id:mock456 -->
  Need to use new brand colors
- [ ] Get client approval <!-- id:appr789 -->
- [-] Old approach that was cancelled <!-- id:old999 -->

## Notes
The wireframes are complete and approved.
""",
            "2024/notes/meeting-notes.md": """# Weekly Team Meeting - Dec 15

## Attendees
- Alice, Bob, Charlie

## Action Items
- [ ] Alice: Update documentation <!-- id:doc123 -->
- [x] Bob: Fix bug #456 <!-- id:bug456 -->
- [ ] Charlie: Review PR #789 <!-- id:pr789 -->
  Needs to be done by Friday

## Discussion
We discussed the quarterly goals.
""",
            "2025/planning/goals.md": """# 2025 Goals

## Q1 Objectives
- [ ] Launch new feature <!-- id:feat2025 -->
- [ ] Improve performance by 20% <!-- id:perf2025 -->
  Focus on database queries

## Q2 Ideas
- [ ] Consider mobile app <!-- id:mobile2025 -->

Some general notes about the year ahead.
""",
            "archive/old-project.md": """# Old Project (Archived)

This project is mostly done but has some lingering tasks.

- [x] Phase 1 complete <!-- id:p1done -->
- [-] Phase 2 cancelled <!-- id:p2cancel -->
- [ ] Cleanup remaining files <!-- id:cleanup123 -->
  Need to remove temp directories
""",
            "random-notes.md": """# Random Notes

Just some thoughts and incomplete todos:

- [ ] Call the dentist <!-- id:dentist99 -->
- [ ] Buy groceries <!-- id:grocery99 -->
  - Milk
  - Bread
  - Eggs

No other tasks here, just notes.
""",
            "godspeed/Personal.md": """# This file should be ignored by the sweeper
- [ ] Existing Godspeed task <!-- id:existing1 -->
""",
        }

        # Write test files
        for rel_path, content in test_files.items():
            file_path = base_dir / rel_path
            with open(file_path, "w") as f:
                f.write(content)

        print(f"📁 Created test structure in: {base_dir}")
        print(f"📄 Files created:")
        for rel_path in test_files.keys():
            print(f"   • {rel_path}")

        return base_dir


def test_sweeper():
    """Test the task sweeper functionality."""
    print("=" * 60)
    print("🧪 TESTING TASK SWEEPER")
    print("=" * 60)

    # Create test directory
    with tempfile.TemporaryDirectory() as temp_dir:
        base_dir = Path(temp_dir)

        # Create the test structure directly here since we can't return from context manager
        (base_dir / "2024" / "projects").mkdir(parents=True)
        (base_dir / "2024" / "notes").mkdir(parents=True)
        (base_dir / "2025" / "planning").mkdir(parents=True)
        (base_dir / "archive").mkdir(parents=True)
        (base_dir / "godspeed").mkdir(parents=True)

        test_files = {
            "2024/projects/website.md": """# Website Redesign Project

- [x] Create wireframes <!-- id:wire123 -->
- [ ] Design mockups <!-- id:mock456 -->
  Need to use new brand colors
- [ ] Get client approval <!-- id:appr789 -->
- [-] Old approach that was cancelled <!-- id:old999 -->

Project notes here.
""",
            "2024/notes/meeting-notes.md": """# Weekly Team Meeting

- [ ] Alice: Update documentation <!-- id:doc123 -->
- [x] Bob: Fix bug #456 <!-- id:bug456 -->
- [ ] Charlie: Review PR #789 <!-- id:pr789 -->
  Needs to be done by Friday
""",
            "2025/planning/goals.md": """# 2025 Goals

- [ ] Launch new feature <!-- id:feat2025 -->
- [ ] Improve performance by 20% <!-- id:perf2025 -->
  Focus on database queries
""",
            "random-notes.md": """# Random Notes

- [ ] Call the dentist <!-- id:dentist99 -->
- [ ] Buy groceries <!-- id:grocery99 -->

Just some notes.
""",
        }

        for rel_path, content in test_files.items():
            file_path = base_dir / rel_path
            with open(file_path, "w") as f:
                f.write(content)

        godspeed_dir = base_dir / "godspeed"

        print(f"\n📁 Test directory: {base_dir}")
        print(f"📥 Godspeed directory: {godspeed_dir}")

        # Import and run the sweeper
        from sweep_tasks import TaskSweeper

        print(f"\n🧹 Running task sweeper (DRY RUN)...")
        sweeper = TaskSweeper(base_dir, godspeed_dir, dry_run=True)
        result = sweeper.sweep_tasks()

        print(f"\n🔍 DRY RUN RESULTS:")
        print(f"   • Would sweep: {result['swept_tasks']} tasks")
        print(f"   • From files: {len(result['processed_files'])}")

        if result["processed_files"]:
            print(f"   • Files that would be modified:")
            for file_path in result["processed_files"]:
                print(f"     - {file_path}")

        # Now run for real
        print(f"\n🚀 Running task sweeper (REAL)...")
        sweeper_real = TaskSweeper(base_dir, godspeed_dir, dry_run=False)
        result_real = sweeper_real.sweep_tasks()

        # Check the inbox
        inbox_file = godspeed_dir / "Inbox.md"
        if inbox_file.exists():
            print(f"\n📥 Inbox.md contents:")
            print("-" * 40)
            with open(inbox_file, "r") as f:
                print(f.read())
            print("-" * 40)

        # Check a cleaned file
        website_file = base_dir / "2024" / "projects" / "website.md"
        if website_file.exists():
            print(f"\n📄 Cleaned file (website.md) contents:")
            print("-" * 30)
            with open(website_file, "r") as f:
                print(f.read())
            print("-" * 30)

        print(f"\n✅ TEST COMPLETE!")
        print(f"   • Swept {result_real['swept_tasks']} incomplete tasks")
        print(f"   • Into: {result_real['inbox_file']}")


if __name__ == "__main__":
    test_sweeper()
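The sweep this test drives boils down to: collect every incomplete `- [ ]` line outside the godspeed directory, remove it from its source file, and append it to `Inbox.md`. A rough sketch under those assumptions (the real `TaskSweeper` in `sweep_tasks.py` also carries indented note lines along and reports processed files, which this omits):

```python
import re
from pathlib import Path

INCOMPLETE = re.compile(r"^- \[ \] .+")


def sweep(base_dir: Path, godspeed_dir: Path) -> int:
    """Move incomplete tasks from scattered notes into godspeed/Inbox.md."""
    swept: list[str] = []
    for md_file in base_dir.rglob("*.md"):
        if godspeed_dir in md_file.parents:
            continue  # never sweep the sync directory itself
        lines = md_file.read_text().splitlines(keepends=True)
        keep = [line for line in lines if not INCOMPLETE.match(line)]
        if len(keep) != len(lines):
            swept.extend(line for line in lines if INCOMPLETE.match(line))
            md_file.write_text("".join(keep))
    with open(godspeed_dir / "Inbox.md", "a") as inbox:
        inbox.writelines(swept)
    return len(swept)
```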
400
tests/test_platform.py
Normal file
@@ -0,0 +1,400 @@
"""Tests for platform compatibility utilities."""

import pytest
import os
import sys
import platform
from pathlib import Path
from unittest.mock import patch, MagicMock

from src.utils.platform import (
    get_platform_info,
    is_supported_platform,
    get_default_config_dir,
    get_default_data_dir,
    get_default_log_dir,
    get_default_maildir_path,
    check_dependencies,
    get_shell_info,
    get_shell_config_file,
    get_platform_specific_commands,
    check_terminal_compatibility,
    check_textual_support,
    get_platform_recommendations,
    validate_environment,
)


class TestGetPlatformInfo:
    """Tests for get_platform_info function."""

    def test_returns_dict(self):
        """Test that get_platform_info returns a dictionary."""
        info = get_platform_info()
        assert isinstance(info, dict)

    def test_contains_required_keys(self):
        """Test that info contains all required keys."""
        info = get_platform_info()

        required_keys = [
            "system",
            "release",
            "version",
            "machine",
            "processor",
            "python_version",
            "python_implementation",
        ]

        for key in required_keys:
            assert key in info

    def test_values_are_strings(self):
        """Test that all values are strings."""
        info = get_platform_info()

        for value in info.values():
            assert isinstance(value, str)


class TestIsSupportedPlatform:
    """Tests for is_supported_platform function."""

    def test_current_platform_supported(self):
        """Test that current platform is supported (if Python 3.12+)."""
        python_version = tuple(map(int, platform.python_version().split(".")))

        if python_version >= (3, 12):
            assert is_supported_platform() is True

    @patch("src.utils.platform.platform.python_version")
    def test_old_python_not_supported(self, mock_version):
        """Test that old Python versions are not supported."""
        mock_version.return_value = "3.10.0"

        assert is_supported_platform() is False

    @patch("src.utils.platform.platform.system")
    @patch("src.utils.platform.platform.python_version")
    def test_supported_systems(self, mock_version, mock_system):
        """Test that Darwin, Linux, Windows are supported."""
        mock_version.return_value = "3.12.0"

        for system in ["Darwin", "Linux", "Windows"]:
            mock_system.return_value = system
            assert is_supported_platform() is True

    @patch("src.utils.platform.platform.system")
    @patch("src.utils.platform.platform.python_version")
    def test_unsupported_system(self, mock_version, mock_system):
        """Test that unknown systems are not supported."""
        mock_version.return_value = "3.12.0"
        mock_system.return_value = "Unknown"

        assert is_supported_platform() is False


class TestGetDefaultDirectories:
    """Tests for default directory functions."""

    def test_config_dir_returns_path(self):
        """Test that get_default_config_dir returns a Path."""
        result = get_default_config_dir()
        assert isinstance(result, Path)

    def test_data_dir_returns_path(self):
        """Test that get_default_data_dir returns a Path."""
        result = get_default_data_dir()
        assert isinstance(result, Path)

    def test_log_dir_returns_path(self):
        """Test that get_default_log_dir returns a Path."""
        result = get_default_log_dir()
        assert isinstance(result, Path)

    def test_maildir_path_returns_path(self):
        """Test that get_default_maildir_path returns a Path."""
        result = get_default_maildir_path()
        assert isinstance(result, Path)

    @patch("src.utils.platform.platform.system")
    def test_macos_config_dir(self, mock_system):
        """Test macOS config directory."""
        mock_system.return_value = "Darwin"

        result = get_default_config_dir()

        assert "Library" in str(result)
        assert "Application Support" in str(result)

    @patch("src.utils.platform.platform.system")
    def test_linux_config_dir_default(self, mock_system):
        """Test Linux config directory (default)."""
        mock_system.return_value = "Linux"

        with patch.dict(os.environ, {}, clear=True):
            result = get_default_config_dir()

            assert ".config" in str(result)

    @patch("src.utils.platform.platform.system")
    def test_linux_config_dir_xdg(self, mock_system):
        """Test Linux config directory with XDG_CONFIG_HOME."""
        mock_system.return_value = "Linux"

        with patch.dict(os.environ, {"XDG_CONFIG_HOME": "/custom/config"}):
            result = get_default_config_dir()

            assert "/custom/config" in str(result)

    @patch("src.utils.platform.platform.system")
    def test_windows_config_dir(self, mock_system):
        """Test Windows config directory."""
        mock_system.return_value = "Windows"

        with patch.dict(os.environ, {"APPDATA": "C:\\Users\\test\\AppData\\Roaming"}):
            result = get_default_config_dir()

            assert "luk" in str(result)


class TestCheckDependencies:
    """Tests for check_dependencies function."""

    def test_returns_dict(self):
        """Test that check_dependencies returns a dictionary."""
        result = check_dependencies()
        assert isinstance(result, dict)

    def test_python_always_available(self):
        """Test that Python is always marked as available."""
        result = check_dependencies()
        assert result["python"] is True

    def test_contains_expected_keys(self):
        """Test that result contains expected dependency keys."""
        result = check_dependencies()

        expected_keys = ["python", "pip", "git", "curl", "wget"]
        for key in expected_keys:
            assert key in result

    def test_values_are_bool(self):
        """Test that all values are boolean."""
        result = check_dependencies()

        for value in result.values():
            assert isinstance(value, bool)


class TestGetShellInfo:
    """Tests for shell info functions."""

    def test_get_shell_info_returns_dict(self):
        """Test that get_shell_info returns a dictionary."""
        result = get_shell_info()
        assert isinstance(result, dict)

    def test_get_shell_info_keys(self):
        """Test that result has expected keys."""
        result = get_shell_info()

        assert "shell_path" in result
        assert "shell_name" in result
        assert "config_file" in result

    def test_get_shell_config_file_bash(self):
        """Test shell config file for bash."""
        result = get_shell_config_file("bash")
        assert result == "~/.bashrc"

    def test_get_shell_config_file_zsh(self):
        """Test shell config file for zsh."""
        result = get_shell_config_file("zsh")
        assert result == "~/.zshrc"

    def test_get_shell_config_file_fish(self):
        """Test shell config file for fish."""
        result = get_shell_config_file("fish")
        assert result == "~/.config/fish/config.fish"

    def test_get_shell_config_file_unknown(self):
        """Test shell config file for unknown shell."""
        result = get_shell_config_file("unknown")
        assert result == "~/.profile"


class TestGetPlatformSpecificCommands:
    """Tests for get_platform_specific_commands function."""

    @patch("src.utils.platform.platform.system")
    def test_macos_commands(self, mock_system):
        """Test macOS-specific commands."""
        mock_system.return_value = "Darwin"

        result = get_platform_specific_commands()

        assert result["open"] == "open"
        assert result["copy"] == "pbcopy"

    @patch("src.utils.platform.platform.system")
    def test_linux_commands(self, mock_system):
        """Test Linux-specific commands."""
        mock_system.return_value = "Linux"

        result = get_platform_specific_commands()

        assert result["open"] == "xdg-open"
        assert "xclip" in result["copy"]

    @patch("src.utils.platform.platform.system")
    def test_windows_commands(self, mock_system):
        """Test Windows-specific commands."""
        mock_system.return_value = "Windows"

        result = get_platform_specific_commands()

        assert result["open"] == "start"
        assert result["copy"] == "clip"

    @patch("src.utils.platform.platform.system")
    def test_unknown_system_commands(self, mock_system):
        """Test unknown system returns empty dict."""
        mock_system.return_value = "Unknown"

        result = get_platform_specific_commands()

        assert result == {}


class TestCheckTerminalCompatibility:
    """Tests for terminal compatibility functions."""

    def test_returns_dict(self):
        """Test that check_terminal_compatibility returns a dictionary."""
        result = check_terminal_compatibility()
        assert isinstance(result, dict)

    def test_contains_expected_keys(self):
        """Test that result contains expected keys."""
        result = check_terminal_compatibility()

        expected_keys = [
            "color_support",
            "unicode_support",
            "mouse_support",
            "textual_support",
        ]
        for key in expected_keys:
            assert key in result

    def test_values_are_bool(self):
        """Test that all values are boolean."""
        result = check_terminal_compatibility()

        for value in result.values():
            assert isinstance(value, bool)

    def test_check_textual_support(self):
        """Test check_textual_support."""
        # Textual should be available in our test environment
        result = check_textual_support()
        assert isinstance(result, bool)


class TestGetPlatformRecommendations:
    """Tests for get_platform_recommendations function."""

    def test_returns_list(self):
        """Test that get_platform_recommendations returns a list."""
        result = get_platform_recommendations()
        assert isinstance(result, list)

    @patch("src.utils.platform.platform.system")
    def test_macos_recommendations(self, mock_system):
        """Test macOS recommendations."""
        mock_system.return_value = "Darwin"

        result = get_platform_recommendations()

        assert len(result) > 0
        # Check for macOS-specific content
        assert any("iTerm2" in r or "Terminal.app" in r for r in result)

    @patch("src.utils.platform.platform.system")
    def test_linux_recommendations(self, mock_system):
        """Test Linux recommendations."""
        mock_system.return_value = "Linux"

        result = get_platform_recommendations()

        assert len(result) > 0
        # Check for Linux-specific content
        assert any("UTF-8" in r or "GNOME" in r for r in result)

    @patch("src.utils.platform.platform.system")
    def test_windows_recommendations(self, mock_system):
        """Test Windows recommendations."""
        mock_system.return_value = "Windows"

        result = get_platform_recommendations()

        assert len(result) > 0
        # Check for Windows-specific content
        assert any("Windows Terminal" in r or "WSL" in r for r in result)


class TestValidateEnvironment:
    """Tests for validate_environment function."""

    def test_returns_dict(self):
        """Test that validate_environment returns a dictionary."""
        result = validate_environment()
        assert isinstance(result, dict)

    def test_contains_required_keys(self):
        """Test that result contains required keys."""
        result = validate_environment()

        required_keys = [
            "platform_supported",
            "platform_info",
            "dependencies",
            "terminal_compatibility",
            "recommendations",
            "config_dir",
            "data_dir",
            "log_dir",
        ]

        for key in required_keys:
            assert key in result

    def test_platform_supported_is_bool(self):
        """Test that platform_supported is boolean."""
        result = validate_environment()
        assert isinstance(result["platform_supported"], bool)

    def test_platform_info_is_dict(self):
        """Test that platform_info is a dictionary."""
        result = validate_environment()
        assert isinstance(result["platform_info"], dict)

    def test_dependencies_is_dict(self):
        """Test that dependencies is a dictionary."""
        result = validate_environment()
        assert isinstance(result["dependencies"], dict)

    def test_recommendations_is_list(self):
        """Test that recommendations is a list."""
        result = validate_environment()
        assert isinstance(result["recommendations"], list)

    def test_directory_paths_are_strings(self):
        """Test that directory paths are strings."""
        result = validate_environment()

        assert isinstance(result["config_dir"], str)
        assert isinstance(result["data_dir"], str)
        assert isinstance(result["log_dir"], str)
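These directory tests encode the usual per-OS conventions. A sketch of the resolution logic they imply (the app name `luk` comes from `test_windows_config_dir`; the rest is assumed to mirror `src/utils/platform.py` rather than copied from it):

```python
import os
import platform
from pathlib import Path

APP_NAME = "luk"  # taken from test_windows_config_dir; otherwise an assumption


def default_config_dir() -> Path:
    system = platform.system()
    if system == "Darwin":
        return Path.home() / "Library" / "Application Support" / APP_NAME
    if system == "Windows":
        return Path(os.environ.get("APPDATA", str(Path.home()))) / APP_NAME
    # Linux and everything else: honor XDG_CONFIG_HOME, fall back to ~/.config
    xdg = os.environ.get("XDG_CONFIG_HOME")
    base = Path(xdg) if xdg else Path.home() / ".config"
    return base / APP_NAME
```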
239
tests/test_sync_daemon.py
Normal file
@@ -0,0 +1,239 @@
"""Tests for the sync daemon."""

import pytest
import os
import tempfile
import logging
from pathlib import Path
from unittest.mock import MagicMock, patch, AsyncMock

from src.cli.sync_daemon import (
    SyncDaemon,
    create_daemon_config,
)


class TestCreateDaemonConfig:
    """Tests for create_daemon_config function."""

    def test_default_config(self):
        """Test default configuration values."""
        config = create_daemon_config()

        assert config["dry_run"] is False
        assert config["vdir"] == "~/Calendar"
        assert config["icsfile"] is None
        assert config["org"] == "corteva"
        assert config["days_back"] == 1
        assert config["days_forward"] == 30
        assert config["continue_iteration"] is False
        assert config["download_attachments"] is False
        assert config["two_way_calendar"] is False
        assert config["notify"] is False
        assert config["sync_interval"] == 300
        assert config["check_interval"] == 10

    def test_custom_config(self):
        """Test custom configuration values."""
        config = create_daemon_config(
            dry_run=True,
            org="mycompany",
            vdir="~/MyCalendar",
            notify=True,
            sync_interval=600,
        )

        assert config["dry_run"] is True
        assert config["org"] == "mycompany"
        assert config["vdir"] == "~/MyCalendar"
        assert config["notify"] is True
        assert config["sync_interval"] == 600

    def test_pid_file_default(self):
        """Test default PID file path."""
        config = create_daemon_config()

        assert "luk.pid" in config["pid_file"]

    def test_log_file_default(self):
        """Test default log file path."""
        config = create_daemon_config()

        assert "luk.log" in config["log_file"]


class TestSyncDaemon:
    """Tests for the SyncDaemon class."""

    @pytest.fixture
    def temp_dir(self):
        """Create a temporary directory for tests."""
        with tempfile.TemporaryDirectory() as tmpdir:
            yield Path(tmpdir)

    @pytest.fixture
    def daemon_config(self, temp_dir):
        """Create a daemon config with temp paths."""
        return create_daemon_config(
            pid_file=str(temp_dir / "test_daemon.pid"),
            log_file=str(temp_dir / "test_daemon.log"),
        )

    def test_init(self, daemon_config):
        """Test daemon initialization."""
        daemon = SyncDaemon(daemon_config)

        assert daemon.config == daemon_config
        assert daemon.running is False
        assert daemon.sync_interval == 300
        assert daemon.check_interval == 10

    def test_init_custom_intervals(self, temp_dir):
        """Test daemon with custom intervals."""
        config = create_daemon_config(
            pid_file=str(temp_dir / "test_daemon.pid"),
            log_file=str(temp_dir / "test_daemon.log"),
            sync_interval=600,
            check_interval=30,
        )

        daemon = SyncDaemon(config)

        assert daemon.sync_interval == 600
        assert daemon.check_interval == 30

    def test_setup_logging(self, daemon_config, temp_dir):
        """Test logging setup."""
        daemon = SyncDaemon(daemon_config)

        # Logger should be configured
        assert daemon.logger is not None
        assert daemon.logger.level == logging.INFO

        # Log directory should be created
        log_file = temp_dir / "test_daemon.log"
        assert log_file.parent.exists()

    def test_is_running_no_pid_file(self, daemon_config):
        """Test is_running when no PID file exists."""
        daemon = SyncDaemon(daemon_config)

        assert daemon.is_running() is False

    def test_is_running_stale_pid_file(self, daemon_config, temp_dir):
        """Test is_running with stale PID file."""
        daemon = SyncDaemon(daemon_config)

        # Create a PID file with non-existent process
        pid_file = temp_dir / "test_daemon.pid"
        pid_file.write_text("99999999")  # Very unlikely to exist

        # Should detect stale PID and clean up
        assert daemon.is_running() is False
        assert not pid_file.exists()

    def test_get_pid(self, daemon_config, temp_dir):
        """Test get_pid reads PID correctly."""
        daemon = SyncDaemon(daemon_config)

        # Create a PID file
        pid_file = temp_dir / "test_daemon.pid"
        pid_file.write_text("12345")

        assert daemon.get_pid() == 12345

    def test_stop_not_running(self, daemon_config, capsys):
        """Test stop when daemon is not running."""
        daemon = SyncDaemon(daemon_config)

        daemon.stop()

        captured = capsys.readouterr()
        assert "not running" in captured.out.lower()

    def test_status_not_running(self, daemon_config, capsys):
        """Test status when daemon is not running."""
        daemon = SyncDaemon(daemon_config)

        daemon.status()

        captured = capsys.readouterr()
        assert "not running" in captured.out.lower()


class TestSyncDaemonAsync:
    """Tests for async daemon methods."""

    @pytest.fixture
    def temp_dir(self):
        """Create a temporary directory for tests."""
        with tempfile.TemporaryDirectory() as tmpdir:
            yield Path(tmpdir)

    @pytest.fixture
    def daemon_config(self, temp_dir):
        """Create a daemon config with temp paths."""
        return create_daemon_config(
            pid_file=str(temp_dir / "test_daemon.pid"),
            log_file=str(temp_dir / "test_daemon.log"),
        )

    @pytest.mark.asyncio
    async def test_check_for_changes_no_changes(self, daemon_config):
        """Test _check_for_changes when no changes."""
        daemon = SyncDaemon(daemon_config)

        with patch("src.cli.sync_daemon.should_run_godspeed_sync", return_value=False):
            with patch("src.cli.sync_daemon.should_run_sweep", return_value=False):
                result = await daemon._check_for_changes()

                assert result is False

    @pytest.mark.asyncio
    async def test_check_for_changes_godspeed_due(self, daemon_config):
        """Test _check_for_changes when godspeed sync is due."""
        daemon = SyncDaemon(daemon_config)

        with patch("src.cli.sync_daemon.should_run_godspeed_sync", return_value=True):
            with patch("src.cli.sync_daemon.should_run_sweep", return_value=False):
                result = await daemon._check_for_changes()

                assert result is True

    @pytest.mark.asyncio
    async def test_check_for_changes_sweep_due(self, daemon_config):
        """Test _check_for_changes when sweep is due."""
        daemon = SyncDaemon(daemon_config)

        with patch("src.cli.sync_daemon.should_run_godspeed_sync", return_value=False):
            with patch("src.cli.sync_daemon.should_run_sweep", return_value=True):
                result = await daemon._check_for_changes()

                assert result is True

    @pytest.mark.asyncio
    async def test_perform_sync_success(self, daemon_config):
        """Test _perform_sync success."""
        daemon = SyncDaemon(daemon_config)

        with patch(
            "src.cli.sync_daemon._sync_outlook_data", new_callable=AsyncMock
        ) as mock_sync:
            await daemon._perform_sync()

            mock_sync.assert_called_once()

    @pytest.mark.asyncio
    async def test_perform_sync_failure(self, daemon_config):
        """Test _perform_sync handles failure."""
        daemon = SyncDaemon(daemon_config)

        with patch(
            "src.cli.sync_daemon._sync_outlook_data", new_callable=AsyncMock
        ) as mock_sync:
            mock_sync.side_effect = Exception("Sync failed")

            # Should not raise
            await daemon._perform_sync()

            # Error should be logged (we'd need to check the log file)
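`test_is_running_stale_pid_file` pins down the stale-PID contract: a PID file naming a dead process counts as not running and gets cleaned up. A sketch of that check, assumed to approximate `SyncDaemon.is_running` rather than copied from it:

```python
import os
from pathlib import Path


def is_running(pid_file: Path) -> bool:
    """Return True only if the PID file names a live process."""
    if not pid_file.exists():
        return False
    try:
        pid = int(pid_file.read_text().strip())
        os.kill(pid, 0)  # signal 0 probes for existence without signaling
        return True
    except (ValueError, ProcessLookupError):
        pid_file.unlink(missing_ok=True)  # stale or corrupt: clean up
        return False
```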
235
tests/test_sync_dashboard.py
Normal file
@@ -0,0 +1,235 @@
"""Tests for the sync dashboard TUI."""

import pytest
from unittest.mock import MagicMock, patch
from datetime import datetime

from src.cli.sync_dashboard import (
    SyncDashboard,
    SyncProgressTracker,
    TaskStatus,
    TaskListItem,
    get_dashboard,
    get_progress_tracker,
)


class TestSyncProgressTracker:
    """Tests for the SyncProgressTracker class."""

    def test_init(self):
        """Test tracker initialization."""
        mock_dashboard = MagicMock()
        tracker = SyncProgressTracker(mock_dashboard)

        assert tracker.dashboard == mock_dashboard

    def test_start_task(self):
        """Test starting a new task."""
        mock_dashboard = MagicMock()
        tracker = SyncProgressTracker(mock_dashboard)

        tracker.start_task("inbox", 100)

        mock_dashboard.start_task.assert_called_once_with("inbox", 100)

    def test_start_task_default_total(self):
        """Test starting a task with default total."""
        mock_dashboard = MagicMock()
        tracker = SyncProgressTracker(mock_dashboard)

        tracker.start_task("calendar")

        mock_dashboard.start_task.assert_called_once_with("calendar", 100)

    def test_update_task(self):
        """Test updating task progress."""
        mock_dashboard = MagicMock()
        tracker = SyncProgressTracker(mock_dashboard)

        tracker.update_task("inbox", 50, "Processing...")

        mock_dashboard.update_task.assert_called_once_with("inbox", 50, "Processing...")

    def test_update_task_without_message(self):
        """Test updating task without message."""
        mock_dashboard = MagicMock()
        tracker = SyncProgressTracker(mock_dashboard)

        tracker.update_task("inbox", 75)

        mock_dashboard.update_task.assert_called_once_with("inbox", 75, "")

    def test_complete_task(self):
        """Test completing a task."""
        mock_dashboard = MagicMock()
        tracker = SyncProgressTracker(mock_dashboard)

        tracker.complete_task("inbox", "Done!")

        mock_dashboard.complete_task.assert_called_once_with("inbox", "Done!")

    def test_complete_task_no_message(self):
        """Test completing a task without a message."""
        mock_dashboard = MagicMock()
        tracker = SyncProgressTracker(mock_dashboard)

        tracker.complete_task("inbox")

        mock_dashboard.complete_task.assert_called_once_with("inbox", "")

    def test_error_task(self):
        """Test marking a task as failed."""
        mock_dashboard = MagicMock()
        tracker = SyncProgressTracker(mock_dashboard)

        tracker.error_task("inbox", "Connection failed")

        mock_dashboard.error_task.assert_called_once_with("inbox", "Connection failed")

    def test_skip_task(self):
        """Test skipping a task."""
        mock_dashboard = MagicMock()
        tracker = SyncProgressTracker(mock_dashboard)

        tracker.skip_task("sweep", "Not needed")

        mock_dashboard.skip_task.assert_called_once_with("sweep", "Not needed")

    def test_skip_task_no_reason(self):
        """Test skipping a task without a reason."""
        mock_dashboard = MagicMock()
        tracker = SyncProgressTracker(mock_dashboard)

        tracker.skip_task("sweep")

        mock_dashboard.skip_task.assert_called_once_with("sweep", "")


class TestTaskListItem:
    """Tests for the TaskListItem class."""

    def test_init(self):
        """Test TaskListItem initialization."""
        item = TaskListItem("inbox", "Inbox Sync")

        assert item.task_id == "inbox"
        assert item.task_name == "Inbox Sync"
        assert item.status == TaskStatus.PENDING
        assert item.progress == 0
        assert item.total == 100

    def test_get_status_icon_pending(self):
        """Test status icon for pending."""
        item = TaskListItem("inbox", "Inbox Sync")
        assert item._get_status_icon() == "○"

    def test_get_status_icon_running(self):
        """Test status icon for running (animated spinner)."""
        from src.cli.sync_dashboard import SPINNER_FRAMES

        item = TaskListItem("inbox", "Inbox Sync")
        item.status = TaskStatus.RUNNING
        # Running uses spinner frames, starts at frame 0
        assert item._get_status_icon() == SPINNER_FRAMES[0]

    def test_get_status_icon_completed(self):
        """Test status icon for completed."""
        item = TaskListItem("inbox", "Inbox Sync")
        item.status = TaskStatus.COMPLETED
        assert item._get_status_icon() == "✓"

    def test_get_status_icon_error(self):
        """Test status icon for error."""
        item = TaskListItem("inbox", "Inbox Sync")
        item.status = TaskStatus.ERROR
        assert item._get_status_icon() == "✗"

    def test_get_status_color_pending(self):
        """Test status color for pending."""
        item = TaskListItem("inbox", "Inbox Sync")
        assert item._get_status_color() == "dim"

    def test_get_status_color_running(self):
        """Test status color for running."""
        item = TaskListItem("inbox", "Inbox Sync")
        item.status = TaskStatus.RUNNING
        assert item._get_status_color() == "cyan"

    def test_get_status_color_completed(self):
        """Test status color for completed."""
        item = TaskListItem("inbox", "Inbox Sync")
        item.status = TaskStatus.COMPLETED
        assert item._get_status_color() == "bright_white"

    def test_get_status_color_error(self):
        """Test status color for error."""
        item = TaskListItem("inbox", "Inbox Sync")
        item.status = TaskStatus.ERROR
        assert item._get_status_color() == "red"


class TestTaskStatus:
    """Tests for the TaskStatus constants."""

    def test_status_values(self):
        """Test TaskStatus values."""
        assert TaskStatus.PENDING == "pending"
        assert TaskStatus.RUNNING == "running"
        assert TaskStatus.COMPLETED == "completed"
        assert TaskStatus.ERROR == "error"


class TestSyncDashboard:
    """Tests for the SyncDashboard class."""

    def test_bindings_defined(self):
        """Test that key bindings are defined."""
        assert len(SyncDashboard.BINDINGS) > 0

        # Bindings use the Binding class which has a key attribute
        binding_keys = [b.key for b in SyncDashboard.BINDINGS]  # type: ignore
        assert "q" in binding_keys
        assert "r" in binding_keys
        assert "s" in binding_keys  # Sync now
        assert "+" in binding_keys  # Increase interval
        assert "-" in binding_keys  # Decrease interval
        assert "ctrl+c" in binding_keys
        assert "up" in binding_keys
        assert "down" in binding_keys

    def test_css_defined(self):
        """Test that CSS is defined."""
        assert SyncDashboard.CSS is not None
        assert len(SyncDashboard.CSS) > 0
        assert ".dashboard" in SyncDashboard.CSS
        assert ".sidebar" in SyncDashboard.CSS
        assert ".main-panel" in SyncDashboard.CSS

    def test_reactive_selected_task(self):
        """Test selected_task reactive attribute is defined."""
        assert hasattr(SyncDashboard, "selected_task")


class TestGlobalInstances:
    """Tests for global dashboard instances."""

    def test_get_dashboard_initial(self):
        """Test get_dashboard returns None initially."""
        # Reset global state
        import src.cli.sync_dashboard as dashboard_module

        dashboard_module._dashboard_instance = None

        result = get_dashboard()
        assert result is None

    def test_get_progress_tracker_initial(self):
        """Test get_progress_tracker returns None initially."""
        # Reset global state
        import src.cli.sync_dashboard as dashboard_module

        dashboard_module._progress_tracker = None

        result = get_progress_tracker()
        assert result is None
723
uv.lock
generated
@@ -1,5 +1,5 @@
|
||||
version = 1
|
||||
revision = 2
|
||||
revision = 3
|
||||
requires-python = ">=3.12"
|
||||
resolution-markers = [
|
||||
"python_full_version >= '3.13'",
|
||||
@@ -205,6 +205,52 @@ wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/50/cd/30110dc0ffcf3b131156077b90e9f60ed75711223f306da4db08eff8403b/beautifulsoup4-4.13.4-py3-none-any.whl", hash = "sha256:9bbbb14bfde9d79f38b8cd5f8c7c85f4b8f2523190ebed90e950a8dea4cb1c4b", size = 187285, upload-time = "2025-04-15T17:05:12.221Z" },
|
||||
]
|
||||
|
||||

[[package]]
name = "black"
version = "25.12.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "click" },
    { name = "mypy-extensions" },
    { name = "packaging" },
    { name = "pathspec" },
    { name = "platformdirs" },
    { name = "pytokens" },
]
sdist = { url = "https://files.pythonhosted.org/packages/c4/d9/07b458a3f1c525ac392b5edc6b191ff140b596f9d77092429417a54e249d/black-25.12.0.tar.gz", hash = "sha256:8d3dd9cea14bff7ddc0eb243c811cdb1a011ebb4800a5f0335a01a68654796a7", size = 659264, upload-time = "2025-12-08T01:40:52.501Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/d1/bd/26083f805115db17fda9877b3c7321d08c647df39d0df4c4ca8f8450593e/black-25.12.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:31f96b7c98c1ddaeb07dc0f56c652e25bdedaac76d5b68a059d998b57c55594a", size = 1924178, upload-time = "2025-12-08T01:49:51.048Z" },
    { url = "https://files.pythonhosted.org/packages/89/6b/ea00d6651561e2bdd9231c4177f4f2ae19cc13a0b0574f47602a7519b6ca/black-25.12.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:05dd459a19e218078a1f98178c13f861fe6a9a5f88fc969ca4d9b49eb1809783", size = 1742643, upload-time = "2025-12-08T01:49:59.09Z" },
    { url = "https://files.pythonhosted.org/packages/6d/f3/360fa4182e36e9875fabcf3a9717db9d27a8d11870f21cff97725c54f35b/black-25.12.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c1f68c5eff61f226934be6b5b80296cf6939e5d2f0c2f7d543ea08b204bfaf59", size = 1800158, upload-time = "2025-12-08T01:44:27.301Z" },
    { url = "https://files.pythonhosted.org/packages/f8/08/2c64830cb6616278067e040acca21d4f79727b23077633953081c9445d61/black-25.12.0-cp312-cp312-win_amd64.whl", hash = "sha256:274f940c147ddab4442d316b27f9e332ca586d39c85ecf59ebdea82cc9ee8892", size = 1426197, upload-time = "2025-12-08T01:45:51.198Z" },
    { url = "https://files.pythonhosted.org/packages/d4/60/a93f55fd9b9816b7432cf6842f0e3000fdd5b7869492a04b9011a133ee37/black-25.12.0-cp312-cp312-win_arm64.whl", hash = "sha256:169506ba91ef21e2e0591563deda7f00030cb466e747c4b09cb0a9dae5db2f43", size = 1237266, upload-time = "2025-12-08T01:45:10.556Z" },
    { url = "https://files.pythonhosted.org/packages/c8/52/c551e36bc95495d2aa1a37d50566267aa47608c81a53f91daa809e03293f/black-25.12.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:a05ddeb656534c3e27a05a29196c962877c83fa5503db89e68857d1161ad08a5", size = 1923809, upload-time = "2025-12-08T01:46:55.126Z" },
    { url = "https://files.pythonhosted.org/packages/a0/f7/aac9b014140ee56d247e707af8db0aae2e9efc28d4a8aba92d0abd7ae9d1/black-25.12.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:9ec77439ef3e34896995503865a85732c94396edcc739f302c5673a2315e1e7f", size = 1742384, upload-time = "2025-12-08T01:49:37.022Z" },
    { url = "https://files.pythonhosted.org/packages/74/98/38aaa018b2ab06a863974c12b14a6266badc192b20603a81b738c47e902e/black-25.12.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0e509c858adf63aa61d908061b52e580c40eae0dfa72415fa47ac01b12e29baf", size = 1798761, upload-time = "2025-12-08T01:46:05.386Z" },
    { url = "https://files.pythonhosted.org/packages/16/3a/a8ac542125f61574a3f015b521ca83b47321ed19bb63fe6d7560f348bfe1/black-25.12.0-cp313-cp313-win_amd64.whl", hash = "sha256:252678f07f5bac4ff0d0e9b261fbb029fa530cfa206d0a636a34ab445ef8ca9d", size = 1429180, upload-time = "2025-12-08T01:45:34.903Z" },
    { url = "https://files.pythonhosted.org/packages/e6/2d/bdc466a3db9145e946762d52cd55b1385509d9f9004fec1c97bdc8debbfb/black-25.12.0-cp313-cp313-win_arm64.whl", hash = "sha256:bc5b1c09fe3c931ddd20ee548511c64ebf964ada7e6f0763d443947fd1c603ce", size = 1239350, upload-time = "2025-12-08T01:46:09.458Z" },
    { url = "https://files.pythonhosted.org/packages/35/46/1d8f2542210c502e2ae1060b2e09e47af6a5e5963cb78e22ec1a11170b28/black-25.12.0-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:0a0953b134f9335c2434864a643c842c44fba562155c738a2a37a4d61f00cad5", size = 1917015, upload-time = "2025-12-08T01:53:27.987Z" },
    { url = "https://files.pythonhosted.org/packages/41/37/68accadf977672beb8e2c64e080f568c74159c1aaa6414b4cd2aef2d7906/black-25.12.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:2355bbb6c3b76062870942d8cc450d4f8ac71f9c93c40122762c8784df49543f", size = 1741830, upload-time = "2025-12-08T01:54:36.861Z" },
    { url = "https://files.pythonhosted.org/packages/ac/76/03608a9d8f0faad47a3af3a3c8c53af3367f6c0dd2d23a84710456c7ac56/black-25.12.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9678bd991cc793e81d19aeeae57966ee02909877cb65838ccffef24c3ebac08f", size = 1791450, upload-time = "2025-12-08T01:44:52.581Z" },
    { url = "https://files.pythonhosted.org/packages/06/99/b2a4bd7dfaea7964974f947e1c76d6886d65fe5d24f687df2d85406b2609/black-25.12.0-cp314-cp314-win_amd64.whl", hash = "sha256:97596189949a8aad13ad12fcbb4ae89330039b96ad6742e6f6b45e75ad5cfd83", size = 1452042, upload-time = "2025-12-08T01:46:13.188Z" },
    { url = "https://files.pythonhosted.org/packages/b2/7c/d9825de75ae5dd7795d007681b752275ea85a1c5d83269b4b9c754c2aaab/black-25.12.0-cp314-cp314-win_arm64.whl", hash = "sha256:778285d9ea197f34704e3791ea9404cd6d07595745907dd2ce3da7a13627b29b", size = 1267446, upload-time = "2025-12-08T01:46:14.497Z" },
    { url = "https://files.pythonhosted.org/packages/68/11/21331aed19145a952ad28fca2756a1433ee9308079bd03bd898e903a2e53/black-25.12.0-py3-none-any.whl", hash = "sha256:48ceb36c16dbc84062740049eef990bb2ce07598272e673c17d1a7720c71c828", size = 206191, upload-time = "2025-12-08T01:40:50.963Z" },
]

[[package]]
name = "build"
version = "1.3.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "colorama", marker = "os_name == 'nt'" },
    { name = "packaging" },
    { name = "pyproject-hooks" },
]
sdist = { url = "https://files.pythonhosted.org/packages/25/1c/23e33405a7c9eac261dff640926b8b5adaed6a6eb3e1767d441ed611d0c0/build-1.3.0.tar.gz", hash = "sha256:698edd0ea270bde950f53aed21f3a0135672206f3911e0176261a31e0e07b397", size = 48544, upload-time = "2025-08-01T21:27:09.268Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/cb/8c/2b30c12155ad8de0cf641d76a8b396a16d2c36bc6d50b621a62b7c4567c1/build-1.3.0-py3-none-any.whl", hash = "sha256:7145f0b5061ba90a1500d60bd1b13ca0a8a4cebdd0cc16ed8adf1c0e739f43b4", size = 23382, upload-time = "2025-08-01T21:27:07.844Z" },
]

[[package]]
name = "certifi"
version = "2025.4.26"
@@ -248,38 +294,21 @@ wheels = [
]

[[package]]
name = "charset-normalizer"
version = "3.4.2"
name = "cfgv"
version = "3.5.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/e4/33/89c2ced2b67d1c2a61c19c6751aa8902d46ce3dacb23600a283619f5a12d/charset_normalizer-3.4.2.tar.gz", hash = "sha256:5baececa9ecba31eff645232d59845c07aa030f0c81ee70184a90d35099a0e63", size = 126367, upload-time = "2025-05-02T08:34:42.01Z" }
sdist = { url = "https://files.pythonhosted.org/packages/4e/b5/721b8799b04bf9afe054a3899c6cf4e880fcf8563cc71c15610242490a0c/cfgv-3.5.0.tar.gz", hash = "sha256:d5b1034354820651caa73ede66a6294d6e95c1b00acc5e9b098e917404669132", size = 7334, upload-time = "2025-11-19T20:55:51.612Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/d7/a4/37f4d6035c89cac7930395a35cc0f1b872e652eaafb76a6075943754f095/charset_normalizer-3.4.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:0c29de6a1a95f24b9a1aa7aefd27d2487263f00dfd55a77719b530788f75cff7", size = 199936, upload-time = "2025-05-02T08:32:33.712Z" },
    { url = "https://files.pythonhosted.org/packages/ee/8a/1a5e33b73e0d9287274f899d967907cd0bf9c343e651755d9307e0dbf2b3/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cddf7bd982eaa998934a91f69d182aec997c6c468898efe6679af88283b498d3", size = 143790, upload-time = "2025-05-02T08:32:35.768Z" },
    { url = "https://files.pythonhosted.org/packages/66/52/59521f1d8e6ab1482164fa21409c5ef44da3e9f653c13ba71becdd98dec3/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:fcbe676a55d7445b22c10967bceaaf0ee69407fbe0ece4d032b6eb8d4565982a", size = 153924, upload-time = "2025-05-02T08:32:37.284Z" },
    { url = "https://files.pythonhosted.org/packages/86/2d/fb55fdf41964ec782febbf33cb64be480a6b8f16ded2dbe8db27a405c09f/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d41c4d287cfc69060fa91cae9683eacffad989f1a10811995fa309df656ec214", size = 146626, upload-time = "2025-05-02T08:32:38.803Z" },
    { url = "https://files.pythonhosted.org/packages/8c/73/6ede2ec59bce19b3edf4209d70004253ec5f4e319f9a2e3f2f15601ed5f7/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4e594135de17ab3866138f496755f302b72157d115086d100c3f19370839dd3a", size = 148567, upload-time = "2025-05-02T08:32:40.251Z" },
    { url = "https://files.pythonhosted.org/packages/09/14/957d03c6dc343c04904530b6bef4e5efae5ec7d7990a7cbb868e4595ee30/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cf713fe9a71ef6fd5adf7a79670135081cd4431c2943864757f0fa3a65b1fafd", size = 150957, upload-time = "2025-05-02T08:32:41.705Z" },
    { url = "https://files.pythonhosted.org/packages/0d/c8/8174d0e5c10ccebdcb1b53cc959591c4c722a3ad92461a273e86b9f5a302/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:a370b3e078e418187da8c3674eddb9d983ec09445c99a3a263c2011993522981", size = 145408, upload-time = "2025-05-02T08:32:43.709Z" },
    { url = "https://files.pythonhosted.org/packages/58/aa/8904b84bc8084ac19dc52feb4f5952c6df03ffb460a887b42615ee1382e8/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:a955b438e62efdf7e0b7b52a64dc5c3396e2634baa62471768a64bc2adb73d5c", size = 153399, upload-time = "2025-05-02T08:32:46.197Z" },
    { url = "https://files.pythonhosted.org/packages/c2/26/89ee1f0e264d201cb65cf054aca6038c03b1a0c6b4ae998070392a3ce605/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:7222ffd5e4de8e57e03ce2cef95a4c43c98fcb72ad86909abdfc2c17d227fc1b", size = 156815, upload-time = "2025-05-02T08:32:48.105Z" },
    { url = "https://files.pythonhosted.org/packages/fd/07/68e95b4b345bad3dbbd3a8681737b4338ff2c9df29856a6d6d23ac4c73cb/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:bee093bf902e1d8fc0ac143c88902c3dfc8941f7ea1d6a8dd2bcb786d33db03d", size = 154537, upload-time = "2025-05-02T08:32:49.719Z" },
    { url = "https://files.pythonhosted.org/packages/77/1a/5eefc0ce04affb98af07bc05f3bac9094513c0e23b0562d64af46a06aae4/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:dedb8adb91d11846ee08bec4c8236c8549ac721c245678282dcb06b221aab59f", size = 149565, upload-time = "2025-05-02T08:32:51.404Z" },
    { url = "https://files.pythonhosted.org/packages/37/a0/2410e5e6032a174c95e0806b1a6585eb21e12f445ebe239fac441995226a/charset_normalizer-3.4.2-cp312-cp312-win32.whl", hash = "sha256:db4c7bf0e07fc3b7d89ac2a5880a6a8062056801b83ff56d8464b70f65482b6c", size = 98357, upload-time = "2025-05-02T08:32:53.079Z" },
    { url = "https://files.pythonhosted.org/packages/6c/4f/c02d5c493967af3eda9c771ad4d2bbc8df6f99ddbeb37ceea6e8716a32bc/charset_normalizer-3.4.2-cp312-cp312-win_amd64.whl", hash = "sha256:5a9979887252a82fefd3d3ed2a8e3b937a7a809f65dcb1e068b090e165bbe99e", size = 105776, upload-time = "2025-05-02T08:32:54.573Z" },
    { url = "https://files.pythonhosted.org/packages/ea/12/a93df3366ed32db1d907d7593a94f1fe6293903e3e92967bebd6950ed12c/charset_normalizer-3.4.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:926ca93accd5d36ccdabd803392ddc3e03e6d4cd1cf17deff3b989ab8e9dbcf0", size = 199622, upload-time = "2025-05-02T08:32:56.363Z" },
    { url = "https://files.pythonhosted.org/packages/04/93/bf204e6f344c39d9937d3c13c8cd5bbfc266472e51fc8c07cb7f64fcd2de/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:eba9904b0f38a143592d9fc0e19e2df0fa2e41c3c3745554761c5f6447eedabf", size = 143435, upload-time = "2025-05-02T08:32:58.551Z" },
    { url = "https://files.pythonhosted.org/packages/22/2a/ea8a2095b0bafa6c5b5a55ffdc2f924455233ee7b91c69b7edfcc9e02284/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3fddb7e2c84ac87ac3a947cb4e66d143ca5863ef48e4a5ecb83bd48619e4634e", size = 153653, upload-time = "2025-05-02T08:33:00.342Z" },
    { url = "https://files.pythonhosted.org/packages/b6/57/1b090ff183d13cef485dfbe272e2fe57622a76694061353c59da52c9a659/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:98f862da73774290f251b9df8d11161b6cf25b599a66baf087c1ffe340e9bfd1", size = 146231, upload-time = "2025-05-02T08:33:02.081Z" },
    { url = "https://files.pythonhosted.org/packages/e2/28/ffc026b26f441fc67bd21ab7f03b313ab3fe46714a14b516f931abe1a2d8/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c9379d65defcab82d07b2a9dfbfc2e95bc8fe0ebb1b176a3190230a3ef0e07c", size = 148243, upload-time = "2025-05-02T08:33:04.063Z" },
    { url = "https://files.pythonhosted.org/packages/c0/0f/9abe9bd191629c33e69e47c6ef45ef99773320e9ad8e9cb08b8ab4a8d4cb/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e635b87f01ebc977342e2697d05b56632f5f879a4f15955dfe8cef2448b51691", size = 150442, upload-time = "2025-05-02T08:33:06.418Z" },
    { url = "https://files.pythonhosted.org/packages/67/7c/a123bbcedca91d5916c056407f89a7f5e8fdfce12ba825d7d6b9954a1a3c/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:1c95a1e2902a8b722868587c0e1184ad5c55631de5afc0eb96bc4b0d738092c0", size = 145147, upload-time = "2025-05-02T08:33:08.183Z" },
    { url = "https://files.pythonhosted.org/packages/ec/fe/1ac556fa4899d967b83e9893788e86b6af4d83e4726511eaaad035e36595/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:ef8de666d6179b009dce7bcb2ad4c4a779f113f12caf8dc77f0162c29d20490b", size = 153057, upload-time = "2025-05-02T08:33:09.986Z" },
    { url = "https://files.pythonhosted.org/packages/2b/ff/acfc0b0a70b19e3e54febdd5301a98b72fa07635e56f24f60502e954c461/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:32fc0341d72e0f73f80acb0a2c94216bd704f4f0bce10aedea38f30502b271ff", size = 156454, upload-time = "2025-05-02T08:33:11.814Z" },
    { url = "https://files.pythonhosted.org/packages/92/08/95b458ce9c740d0645feb0e96cea1f5ec946ea9c580a94adfe0b617f3573/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:289200a18fa698949d2b39c671c2cc7a24d44096784e76614899a7ccf2574b7b", size = 154174, upload-time = "2025-05-02T08:33:13.707Z" },
    { url = "https://files.pythonhosted.org/packages/78/be/8392efc43487ac051eee6c36d5fbd63032d78f7728cb37aebcc98191f1ff/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4a476b06fbcf359ad25d34a057b7219281286ae2477cc5ff5e3f70a246971148", size = 149166, upload-time = "2025-05-02T08:33:15.458Z" },
    { url = "https://files.pythonhosted.org/packages/44/96/392abd49b094d30b91d9fbda6a69519e95802250b777841cf3bda8fe136c/charset_normalizer-3.4.2-cp313-cp313-win32.whl", hash = "sha256:aaeeb6a479c7667fbe1099af9617c83aaca22182d6cf8c53966491a0f1b7ffb7", size = 98064, upload-time = "2025-05-02T08:33:17.06Z" },
    { url = "https://files.pythonhosted.org/packages/e9/b0/0200da600134e001d91851ddc797809e2fe0ea72de90e09bec5a2fbdaccb/charset_normalizer-3.4.2-cp313-cp313-win_amd64.whl", hash = "sha256:aa6af9e7d59f9c12b33ae4e9450619cf2488e2bbe9b44030905877f0b2324980", size = 105641, upload-time = "2025-05-02T08:33:18.753Z" },
    { url = "https://files.pythonhosted.org/packages/20/94/c5790835a017658cbfabd07f3bfb549140c3ac458cfc196323996b10095a/charset_normalizer-3.4.2-py3-none-any.whl", hash = "sha256:7f56930ab0abd1c45cd15be65cc741c28b1c9a34876ce8c17a2fa107810c0af0", size = 52626, upload-time = "2025-05-02T08:34:40.053Z" },
    { url = "https://files.pythonhosted.org/packages/db/3c/33bac158f8ab7f89b2e59426d5fe2e4f63f7ed25df84c036890172b412b5/cfgv-3.5.0-py2.py3-none-any.whl", hash = "sha256:a8dc6b26ad22ff227d2634a65cb388215ce6cc96bbcc5cfde7641ae87e8dacc0", size = 7445, upload-time = "2025-11-19T20:55:50.744Z" },
]

[[package]]
name = "charset-normalizer"
version = "2.0.12"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/56/31/7bcaf657fafb3c6db8c787a865434290b726653c912085fbd371e9b92e1c/charset-normalizer-2.0.12.tar.gz", hash = "sha256:2857e29ff0d34db842cd7ca3230549d1a697f96ee6d3fb071cfa6c7393832597", size = 79105, upload-time = "2022-02-12T14:33:13.788Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/06/b3/24afc8868eba069a7f03650ac750a778862dc34941a4bebeb58706715726/charset_normalizer-2.0.12-py3-none-any.whl", hash = "sha256:6881edbebdb17b39b4eaaa821b438bf6eddffb4468cf344f09f89def34a8b1df", size = 39623, upload-time = "2022-02-12T14:33:12.294Z" },
]

[[package]]
@@ -324,6 +353,80 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/a7/06/3d6badcf13db419e25b07041d9c7b4a2c331d3f4e7134445ec5df57714cd/coloredlogs-15.0.1-py2.py3-none-any.whl", hash = "sha256:612ee75c546f53e92e70049c9dbfcc18c935a2b9a53b66085ce9ef6a6e5c0934", size = 46018, upload-time = "2021-06-11T10:22:42.561Z" },
]

[[package]]
name = "coverage"
version = "7.13.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/b6/45/2c665ca77ec32ad67e25c77daf1cee28ee4558f3bc571cdbaf88a00b9f23/coverage-7.13.0.tar.gz", hash = "sha256:a394aa27f2d7ff9bc04cf703817773a59ad6dfbd577032e690f961d2460ee936", size = 820905, upload-time = "2025-12-08T13:14:38.055Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/9b/f1/2619559f17f31ba00fc40908efd1fbf1d0a5536eb75dc8341e7d660a08de/coverage-7.13.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:0b3d67d31383c4c68e19a88e28fc4c2e29517580f1b0ebec4a069d502ce1e0bf", size = 218274, upload-time = "2025-12-08T13:12:52.095Z" },
    { url = "https://files.pythonhosted.org/packages/2b/11/30d71ae5d6e949ff93b2a79a2c1b4822e00423116c5c6edfaeef37301396/coverage-7.13.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:581f086833d24a22c89ae0fe2142cfaa1c92c930adf637ddf122d55083fb5a0f", size = 218638, upload-time = "2025-12-08T13:12:53.418Z" },
    { url = "https://files.pythonhosted.org/packages/79/c2/fce80fc6ded8d77e53207489d6065d0fed75db8951457f9213776615e0f5/coverage-7.13.0-cp312-cp312-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:0a3a30f0e257df382f5f9534d4ce3d4cf06eafaf5192beb1a7bd066cb10e78fb", size = 250129, upload-time = "2025-12-08T13:12:54.744Z" },
    { url = "https://files.pythonhosted.org/packages/5b/b6/51b5d1eb6fcbb9a1d5d6984e26cbe09018475c2922d554fd724dd0f056ee/coverage-7.13.0-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:583221913fbc8f53b88c42e8dbb8fca1d0f2e597cb190ce45916662b8b9d9621", size = 252885, upload-time = "2025-12-08T13:12:56.401Z" },
    { url = "https://files.pythonhosted.org/packages/0d/f8/972a5affea41de798691ab15d023d3530f9f56a72e12e243f35031846ff7/coverage-7.13.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5f5d9bd30756fff3e7216491a0d6d520c448d5124d3d8e8f56446d6412499e74", size = 253974, upload-time = "2025-12-08T13:12:57.718Z" },
    { url = "https://files.pythonhosted.org/packages/8a/56/116513aee860b2c7968aa3506b0f59b22a959261d1dbf3aea7b4450a7520/coverage-7.13.0-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:a23e5a1f8b982d56fa64f8e442e037f6ce29322f1f9e6c2344cd9e9f4407ee57", size = 250538, upload-time = "2025-12-08T13:12:59.254Z" },
    { url = "https://files.pythonhosted.org/packages/d6/75/074476d64248fbadf16dfafbf93fdcede389ec821f74ca858d7c87d2a98c/coverage-7.13.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:9b01c22bc74a7fb44066aaf765224c0d933ddf1f5047d6cdfe4795504a4493f8", size = 251912, upload-time = "2025-12-08T13:13:00.604Z" },
    { url = "https://files.pythonhosted.org/packages/f2/d2/aa4f8acd1f7c06024705c12609d8698c51b27e4d635d717cd1934c9668e2/coverage-7.13.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:898cce66d0836973f48dda4e3514d863d70142bdf6dfab932b9b6a90ea5b222d", size = 250054, upload-time = "2025-12-08T13:13:01.892Z" },
    { url = "https://files.pythonhosted.org/packages/19/98/8df9e1af6a493b03694a1e8070e024e7d2cdc77adedc225a35e616d505de/coverage-7.13.0-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:3ab483ea0e251b5790c2aac03acde31bff0c736bf8a86829b89382b407cd1c3b", size = 249619, upload-time = "2025-12-08T13:13:03.236Z" },
    { url = "https://files.pythonhosted.org/packages/d8/71/f8679231f3353018ca66ef647fa6fe7b77e6bff7845be54ab84f86233363/coverage-7.13.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:1d84e91521c5e4cb6602fe11ece3e1de03b2760e14ae4fcf1a4b56fa3c801fcd", size = 251496, upload-time = "2025-12-08T13:13:04.511Z" },
    { url = "https://files.pythonhosted.org/packages/04/86/9cb406388034eaf3c606c22094edbbb82eea1fa9d20c0e9efadff20d0733/coverage-7.13.0-cp312-cp312-win32.whl", hash = "sha256:193c3887285eec1dbdb3f2bd7fbc351d570ca9c02ca756c3afbc71b3c98af6ef", size = 220808, upload-time = "2025-12-08T13:13:06.422Z" },
    { url = "https://files.pythonhosted.org/packages/1c/59/af483673df6455795daf5f447c2f81a3d2fcfc893a22b8ace983791f6f34/coverage-7.13.0-cp312-cp312-win_amd64.whl", hash = "sha256:4f3e223b2b2db5e0db0c2b97286aba0036ca000f06aca9b12112eaa9af3d92ae", size = 221616, upload-time = "2025-12-08T13:13:07.95Z" },
    { url = "https://files.pythonhosted.org/packages/64/b0/959d582572b30a6830398c60dd419c1965ca4b5fb38ac6b7093a0d50ca8d/coverage-7.13.0-cp312-cp312-win_arm64.whl", hash = "sha256:086cede306d96202e15a4b77ace8472e39d9f4e5f9fd92dd4fecdfb2313b2080", size = 220261, upload-time = "2025-12-08T13:13:09.581Z" },
    { url = "https://files.pythonhosted.org/packages/7c/cc/bce226595eb3bf7d13ccffe154c3c487a22222d87ff018525ab4dd2e9542/coverage-7.13.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:28ee1c96109974af104028a8ef57cec21447d42d0e937c0275329272e370ebcf", size = 218297, upload-time = "2025-12-08T13:13:10.977Z" },
    { url = "https://files.pythonhosted.org/packages/3b/9f/73c4d34600aae03447dff3d7ad1d0ac649856bfb87d1ca7d681cfc913f9e/coverage-7.13.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:d1e97353dcc5587b85986cda4ff3ec98081d7e84dd95e8b2a6d59820f0545f8a", size = 218673, upload-time = "2025-12-08T13:13:12.562Z" },
    { url = "https://files.pythonhosted.org/packages/63/ab/8fa097db361a1e8586535ae5073559e6229596b3489ec3ef2f5b38df8cb2/coverage-7.13.0-cp313-cp313-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:99acd4dfdfeb58e1937629eb1ab6ab0899b131f183ee5f23e0b5da5cba2fec74", size = 249652, upload-time = "2025-12-08T13:13:13.909Z" },
    { url = "https://files.pythonhosted.org/packages/90/3a/9bfd4de2ff191feb37ef9465855ca56a6f2f30a3bca172e474130731ac3d/coverage-7.13.0-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:ff45e0cd8451e293b63ced93161e189780baf444119391b3e7d25315060368a6", size = 252251, upload-time = "2025-12-08T13:13:15.553Z" },
    { url = "https://files.pythonhosted.org/packages/df/61/b5d8105f016e1b5874af0d7c67542da780ccd4a5f2244a433d3e20ceb1ad/coverage-7.13.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f4f72a85316d8e13234cafe0a9f81b40418ad7a082792fa4165bd7d45d96066b", size = 253492, upload-time = "2025-12-08T13:13:16.849Z" },
    { url = "https://files.pythonhosted.org/packages/f3/b8/0fad449981803cc47a4694768b99823fb23632150743f9c83af329bb6090/coverage-7.13.0-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:11c21557d0e0a5a38632cbbaca5f008723b26a89d70db6315523df6df77d6232", size = 249850, upload-time = "2025-12-08T13:13:18.142Z" },
    { url = "https://files.pythonhosted.org/packages/9a/e9/8d68337c3125014d918cf4327d5257553a710a2995a6a6de2ac77e5aa429/coverage-7.13.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:76541dc8d53715fb4f7a3a06b34b0dc6846e3c69bc6204c55653a85dd6220971", size = 251633, upload-time = "2025-12-08T13:13:19.56Z" },
    { url = "https://files.pythonhosted.org/packages/55/14/d4112ab26b3a1bc4b3c1295d8452dcf399ed25be4cf649002fb3e64b2d93/coverage-7.13.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:6e9e451dee940a86789134b6b0ffbe31c454ade3b849bb8a9d2cca2541a8e91d", size = 249586, upload-time = "2025-12-08T13:13:20.883Z" },
    { url = "https://files.pythonhosted.org/packages/2c/a9/22b0000186db663b0d82f86c2f1028099ae9ac202491685051e2a11a5218/coverage-7.13.0-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:5c67dace46f361125e6b9cace8fe0b729ed8479f47e70c89b838d319375c8137", size = 249412, upload-time = "2025-12-08T13:13:22.22Z" },
    { url = "https://files.pythonhosted.org/packages/a1/2e/42d8e0d9e7527fba439acdc6ed24a2b97613b1dc85849b1dd935c2cffef0/coverage-7.13.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:f59883c643cb19630500f57016f76cfdcd6845ca8c5b5ea1f6e17f74c8e5f511", size = 251191, upload-time = "2025-12-08T13:13:23.899Z" },
    { url = "https://files.pythonhosted.org/packages/a4/af/8c7af92b1377fd8860536aadd58745119252aaaa71a5213e5a8e8007a9f5/coverage-7.13.0-cp313-cp313-win32.whl", hash = "sha256:58632b187be6f0be500f553be41e277712baa278147ecb7559983c6d9faf7ae1", size = 220829, upload-time = "2025-12-08T13:13:25.182Z" },
    { url = "https://files.pythonhosted.org/packages/58/f9/725e8bf16f343d33cbe076c75dc8370262e194ff10072c0608b8e5cf33a3/coverage-7.13.0-cp313-cp313-win_amd64.whl", hash = "sha256:73419b89f812f498aca53f757dd834919b48ce4799f9d5cad33ca0ae442bdb1a", size = 221640, upload-time = "2025-12-08T13:13:26.836Z" },
    { url = "https://files.pythonhosted.org/packages/8a/ff/e98311000aa6933cc79274e2b6b94a2fe0fe3434fca778eba82003675496/coverage-7.13.0-cp313-cp313-win_arm64.whl", hash = "sha256:eb76670874fdd6091eedcc856128ee48c41a9bbbb9c3f1c7c3cf169290e3ffd6", size = 220269, upload-time = "2025-12-08T13:13:28.116Z" },
    { url = "https://files.pythonhosted.org/packages/cf/cf/bbaa2e1275b300343ea865f7d424cc0a2e2a1df6925a070b2b2d5d765330/coverage-7.13.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:6e63ccc6e0ad8986386461c3c4b737540f20426e7ec932f42e030320896c311a", size = 218990, upload-time = "2025-12-08T13:13:29.463Z" },
    { url = "https://files.pythonhosted.org/packages/21/1d/82f0b3323b3d149d7672e7744c116e9c170f4957e0c42572f0366dbb4477/coverage-7.13.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:494f5459ffa1bd45e18558cd98710c36c0b8fbfa82a5eabcbe671d80ecffbfe8", size = 219340, upload-time = "2025-12-08T13:13:31.524Z" },
    { url = "https://files.pythonhosted.org/packages/fb/e3/fe3fd4702a3832a255f4d43013eacb0ef5fc155a5960ea9269d8696db28b/coverage-7.13.0-cp313-cp313t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:06cac81bf10f74034e055e903f5f946e3e26fc51c09fc9f584e4a1605d977053", size = 260638, upload-time = "2025-12-08T13:13:32.965Z" },
    { url = "https://files.pythonhosted.org/packages/ad/01/63186cb000307f2b4da463f72af9b85d380236965574c78e7e27680a2593/coverage-7.13.0-cp313-cp313t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:f2ffc92b46ed6e6760f1d47a71e56b5664781bc68986dbd1836b2b70c0ce2071", size = 262705, upload-time = "2025-12-08T13:13:34.378Z" },
    { url = "https://files.pythonhosted.org/packages/7c/a1/c0dacef0cc865f2455d59eed3548573ce47ed603205ffd0735d1d78b5906/coverage-7.13.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0602f701057c6823e5db1b74530ce85f17c3c5be5c85fc042ac939cbd909426e", size = 265125, upload-time = "2025-12-08T13:13:35.73Z" },
    { url = "https://files.pythonhosted.org/packages/ef/92/82b99223628b61300bd382c205795533bed021505eab6dd86e11fb5d7925/coverage-7.13.0-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:25dc33618d45456ccb1d37bce44bc78cf269909aa14c4db2e03d63146a8a1493", size = 259844, upload-time = "2025-12-08T13:13:37.69Z" },
    { url = "https://files.pythonhosted.org/packages/cf/2c/89b0291ae4e6cd59ef042708e1c438e2290f8c31959a20055d8768349ee2/coverage-7.13.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:71936a8b3b977ddd0b694c28c6a34f4fff2e9dd201969a4ff5d5fc7742d614b0", size = 262700, upload-time = "2025-12-08T13:13:39.525Z" },
    { url = "https://files.pythonhosted.org/packages/bf/f9/a5f992efae1996245e796bae34ceb942b05db275e4b34222a9a40b9fbd3b/coverage-7.13.0-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:936bc20503ce24770c71938d1369461f0c5320830800933bc3956e2a4ded930e", size = 260321, upload-time = "2025-12-08T13:13:41.172Z" },
    { url = "https://files.pythonhosted.org/packages/4c/89/a29f5d98c64fedbe32e2ac3c227fbf78edc01cc7572eee17d61024d89889/coverage-7.13.0-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:af0a583efaacc52ae2521f8d7910aff65cdb093091d76291ac5820d5e947fc1c", size = 259222, upload-time = "2025-12-08T13:13:43.282Z" },
    { url = "https://files.pythonhosted.org/packages/b3/c3/940fe447aae302a6701ee51e53af7e08b86ff6eed7631e5740c157ee22b9/coverage-7.13.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:f1c23e24a7000da892a312fb17e33c5f94f8b001de44b7cf8ba2e36fbd15859e", size = 261411, upload-time = "2025-12-08T13:13:44.72Z" },
    { url = "https://files.pythonhosted.org/packages/eb/31/12a4aec689cb942a89129587860ed4d0fd522d5fda81237147fde554b8ae/coverage-7.13.0-cp313-cp313t-win32.whl", hash = "sha256:5f8a0297355e652001015e93be345ee54393e45dc3050af4a0475c5a2b767d46", size = 221505, upload-time = "2025-12-08T13:13:46.332Z" },
    { url = "https://files.pythonhosted.org/packages/65/8c/3b5fe3259d863572d2b0827642c50c3855d26b3aefe80bdc9eba1f0af3b0/coverage-7.13.0-cp313-cp313t-win_amd64.whl", hash = "sha256:6abb3a4c52f05e08460bd9acf04fec027f8718ecaa0d09c40ffbc3fbd70ecc39", size = 222569, upload-time = "2025-12-08T13:13:47.79Z" },
    { url = "https://files.pythonhosted.org/packages/b0/39/f71fa8316a96ac72fc3908839df651e8eccee650001a17f2c78cdb355624/coverage-7.13.0-cp313-cp313t-win_arm64.whl", hash = "sha256:3ad968d1e3aa6ce5be295ab5fe3ae1bf5bb4769d0f98a80a0252d543a2ef2e9e", size = 220841, upload-time = "2025-12-08T13:13:49.243Z" },
    { url = "https://files.pythonhosted.org/packages/f8/4b/9b54bedda55421449811dcd5263a2798a63f48896c24dfb92b0f1b0845bd/coverage-7.13.0-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:453b7ec753cf5e4356e14fe858064e5520c460d3bbbcb9c35e55c0d21155c256", size = 218343, upload-time = "2025-12-08T13:13:50.811Z" },
    { url = "https://files.pythonhosted.org/packages/59/df/c3a1f34d4bba2e592c8979f924da4d3d4598b0df2392fbddb7761258e3dc/coverage-7.13.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:af827b7cbb303e1befa6c4f94fd2bf72f108089cfa0f8abab8f4ca553cf5ca5a", size = 218672, upload-time = "2025-12-08T13:13:52.284Z" },
    { url = "https://files.pythonhosted.org/packages/07/62/eec0659e47857698645ff4e6ad02e30186eb8afd65214fd43f02a76537cb/coverage-7.13.0-cp314-cp314-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:9987a9e4f8197a1000280f7cc089e3ea2c8b3c0a64d750537809879a7b4ceaf9", size = 249715, upload-time = "2025-12-08T13:13:53.791Z" },
    { url = "https://files.pythonhosted.org/packages/23/2d/3c7ff8b2e0e634c1f58d095f071f52ed3c23ff25be524b0ccae8b71f99f8/coverage-7.13.0-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:3188936845cd0cb114fa6a51842a304cdbac2958145d03be2377ec41eb285d19", size = 252225, upload-time = "2025-12-08T13:13:55.274Z" },
    { url = "https://files.pythonhosted.org/packages/aa/ac/fb03b469d20e9c9a81093575003f959cf91a4a517b783aab090e4538764b/coverage-7.13.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a2bdb3babb74079f021696cb46b8bb5f5661165c385d3a238712b031a12355be", size = 253559, upload-time = "2025-12-08T13:13:57.161Z" },
    { url = "https://files.pythonhosted.org/packages/29/62/14afa9e792383c66cc0a3b872a06ded6e4ed1079c7d35de274f11d27064e/coverage-7.13.0-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:7464663eaca6adba4175f6c19354feea61ebbdd735563a03d1e472c7072d27bb", size = 249724, upload-time = "2025-12-08T13:13:58.692Z" },
    { url = "https://files.pythonhosted.org/packages/31/b7/333f3dab2939070613696ab3ee91738950f0467778c6e5a5052e840646b7/coverage-7.13.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:8069e831f205d2ff1f3d355e82f511eb7c5522d7d413f5db5756b772ec8697f8", size = 251582, upload-time = "2025-12-08T13:14:00.642Z" },
    { url = "https://files.pythonhosted.org/packages/81/cb/69162bda9381f39b2287265d7e29ee770f7c27c19f470164350a38318764/coverage-7.13.0-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:6fb2d5d272341565f08e962cce14cdf843a08ac43bd621783527adb06b089c4b", size = 249538, upload-time = "2025-12-08T13:14:02.556Z" },
    { url = "https://files.pythonhosted.org/packages/e0/76/350387b56a30f4970abe32b90b2a434f87d29f8b7d4ae40d2e8a85aacfb3/coverage-7.13.0-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:5e70f92ef89bac1ac8a99b3324923b4749f008fdbd7aa9cb35e01d7a284a04f9", size = 249349, upload-time = "2025-12-08T13:14:04.015Z" },
    { url = "https://files.pythonhosted.org/packages/86/0d/7f6c42b8d59f4c7e43ea3059f573c0dcfed98ba46eb43c68c69e52ae095c/coverage-7.13.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:4b5de7d4583e60d5fd246dd57fcd3a8aa23c6e118a8c72b38adf666ba8e7e927", size = 251011, upload-time = "2025-12-08T13:14:05.505Z" },
    { url = "https://files.pythonhosted.org/packages/d7/f1/4bb2dff379721bb0b5c649d5c5eaf438462cad824acf32eb1b7ca0c7078e/coverage-7.13.0-cp314-cp314-win32.whl", hash = "sha256:a6c6e16b663be828a8f0b6c5027d36471d4a9f90d28444aa4ced4d48d7d6ae8f", size = 221091, upload-time = "2025-12-08T13:14:07.127Z" },
    { url = "https://files.pythonhosted.org/packages/ba/44/c239da52f373ce379c194b0ee3bcc121020e397242b85f99e0afc8615066/coverage-7.13.0-cp314-cp314-win_amd64.whl", hash = "sha256:0900872f2fdb3ee5646b557918d02279dc3af3dfb39029ac4e945458b13f73bc", size = 221904, upload-time = "2025-12-08T13:14:08.542Z" },
    { url = "https://files.pythonhosted.org/packages/89/1f/b9f04016d2a29c2e4a0307baefefad1a4ec5724946a2b3e482690486cade/coverage-7.13.0-cp314-cp314-win_arm64.whl", hash = "sha256:3a10260e6a152e5f03f26db4a407c4c62d3830b9af9b7c0450b183615f05d43b", size = 220480, upload-time = "2025-12-08T13:14:10.958Z" },
    { url = "https://files.pythonhosted.org/packages/16/d4/364a1439766c8e8647860584171c36010ca3226e6e45b1753b1b249c5161/coverage-7.13.0-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:9097818b6cc1cfb5f174e3263eba4a62a17683bcfe5c4b5d07f4c97fa51fbf28", size = 219074, upload-time = "2025-12-08T13:14:13.345Z" },
    { url = "https://files.pythonhosted.org/packages/ce/f4/71ba8be63351e099911051b2089662c03d5671437a0ec2171823c8e03bec/coverage-7.13.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:0018f73dfb4301a89292c73be6ba5f58722ff79f51593352759c1790ded1cabe", size = 219342, upload-time = "2025-12-08T13:14:15.02Z" },
    { url = "https://files.pythonhosted.org/packages/5e/25/127d8ed03d7711a387d96f132589057213e3aef7475afdaa303412463f22/coverage-7.13.0-cp314-cp314t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:166ad2a22ee770f5656e1257703139d3533b4a0b6909af67c6b4a3adc1c98657", size = 260713, upload-time = "2025-12-08T13:14:16.907Z" },
    { url = "https://files.pythonhosted.org/packages/fd/db/559fbb6def07d25b2243663b46ba9eb5a3c6586c0c6f4e62980a68f0ee1c/coverage-7.13.0-cp314-cp314t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:f6aaef16d65d1787280943f1c8718dc32e9cf141014e4634d64446702d26e0ff", size = 262825, upload-time = "2025-12-08T13:14:18.68Z" },
    { url = "https://files.pythonhosted.org/packages/37/99/6ee5bf7eff884766edb43bd8736b5e1c5144d0fe47498c3779326fe75a35/coverage-7.13.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e999e2dcc094002d6e2c7bbc1fb85b58ba4f465a760a8014d97619330cdbbbf3", size = 265233, upload-time = "2025-12-08T13:14:20.55Z" },
    { url = "https://files.pythonhosted.org/packages/d8/90/92f18fe0356ea69e1f98f688ed80cec39f44e9f09a1f26a1bbf017cc67f2/coverage-7.13.0-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:00c3d22cf6fb1cf3bf662aaaa4e563be8243a5ed2630339069799835a9cc7f9b", size = 259779, upload-time = "2025-12-08T13:14:22.367Z" },
    { url = "https://files.pythonhosted.org/packages/90/5d/b312a8b45b37a42ea7d27d7d3ff98ade3a6c892dd48d1d503e773503373f/coverage-7.13.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:22ccfe8d9bb0d6134892cbe1262493a8c70d736b9df930f3f3afae0fe3ac924d", size = 262700, upload-time = "2025-12-08T13:14:24.309Z" },
    { url = "https://files.pythonhosted.org/packages/63/f8/b1d0de5c39351eb71c366f872376d09386640840a2e09b0d03973d791e20/coverage-7.13.0-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:9372dff5ea15930fea0445eaf37bbbafbc771a49e70c0aeed8b4e2c2614cc00e", size = 260302, upload-time = "2025-12-08T13:14:26.068Z" },
    { url = "https://files.pythonhosted.org/packages/aa/7c/d42f4435bc40c55558b3109a39e2d456cddcec37434f62a1f1230991667a/coverage-7.13.0-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:69ac2c492918c2461bc6ace42d0479638e60719f2a4ef3f0815fa2df88e9f940", size = 259136, upload-time = "2025-12-08T13:14:27.604Z" },
    { url = "https://files.pythonhosted.org/packages/b8/d3/23413241dc04d47cfe19b9a65b32a2edd67ecd0b817400c2843ebc58c847/coverage-7.13.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:739c6c051a7540608d097b8e13c76cfa85263ced467168dc6b477bae3df7d0e2", size = 261467, upload-time = "2025-12-08T13:14:29.09Z" },
    { url = "https://files.pythonhosted.org/packages/13/e6/6e063174500eee216b96272c0d1847bf215926786f85c2bd024cf4d02d2f/coverage-7.13.0-cp314-cp314t-win32.whl", hash = "sha256:fe81055d8c6c9de76d60c94ddea73c290b416e061d40d542b24a5871bad498b7", size = 221875, upload-time = "2025-12-08T13:14:31.106Z" },
    { url = "https://files.pythonhosted.org/packages/3b/46/f4fb293e4cbe3620e3ac2a3e8fd566ed33affb5861a9b20e3dd6c1896cbc/coverage-7.13.0-cp314-cp314t-win_amd64.whl", hash = "sha256:445badb539005283825959ac9fa4a28f712c214b65af3a2c464f1adc90f5fcbc", size = 222982, upload-time = "2025-12-08T13:14:33.1Z" },
    { url = "https://files.pythonhosted.org/packages/68/62/5b3b9018215ed9733fbd1ae3b2ed75c5de62c3b55377a52cae732e1b7805/coverage-7.13.0-cp314-cp314t-win_arm64.whl", hash = "sha256:de7f6748b890708578fc4b7bb967d810aeb6fcc9bff4bb77dbca77dab2f9df6a", size = 221016, upload-time = "2025-12-08T13:14:34.601Z" },
    { url = "https://files.pythonhosted.org/packages/8d/4c/1968f32fb9a2604645827e11ff84a31e59d532e01995f904723b4f5328b3/coverage-7.13.0-py3-none-any.whl", hash = "sha256:850d2998f380b1e266459ca5b47bc9e7daf9af1d070f66317972f382d46f1904", size = 210068, upload-time = "2025-12-08T13:14:36.236Z" },
]

[[package]]
name = "cryptography"
version = "44.0.3"
@@ -368,6 +471,15 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/07/6c/aa3f2f849e01cb6a001cd8554a88d4c77c5c1a31c95bdf1cf9301e6d9ef4/defusedxml-0.7.1-py2.py3-none-any.whl", hash = "sha256:a352e7e428770286cc899e2542b6cdaedb2b4953ff269a210103ec58f6198a61", size = 25604, upload-time = "2021-03-08T10:59:24.45Z" },
]

[[package]]
name = "distlib"
version = "0.4.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/96/8e/709914eb2b5749865801041647dc7f4e6d00b549cfe88b65ca192995f07c/distlib-0.4.0.tar.gz", hash = "sha256:feec40075be03a04501a973d81f633735b4b69f98b05450592310c0f401a4e0d", size = 614605, upload-time = "2025-07-17T16:52:00.465Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/33/6b/e0547afaf41bf2c42e52430072fa5658766e3d65bd4b03a563d1b6336f57/distlib-0.4.0-py2.py3-none-any.whl", hash = "sha256:9659f7d87e46584a30b5780e43ac7a2143098441670ff0a49d5f9034c54a6c16", size = 469047, upload-time = "2025-07-17T16:51:58.613Z" },
]

[[package]]
name = "distro"
version = "1.9.0"
@@ -377,6 +489,15 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/12/b3/231ffd4ab1fc9d679809f356cebee130ac7daa00d6d6f3206dd4fd137e9e/distro-1.9.0-py3-none-any.whl", hash = "sha256:7bffd925d65168f85027d8da9af6bddab658135b840670a223589bc0c8ef02b2", size = 20277, upload-time = "2023-12-24T09:54:30.421Z" },
]

[[package]]
name = "docutils"
version = "0.22.3"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/d9/02/111134bfeb6e6c7ac4c74594e39a59f6c0195dc4846afbeac3cba60f1927/docutils-0.22.3.tar.gz", hash = "sha256:21486ae730e4ca9f622677b1412b879af1791efcfba517e4c6f60be543fc8cdd", size = 2290153, upload-time = "2025-11-06T02:35:55.655Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/11/a8/c6a4b901d17399c77cd81fb001ce8961e9f5e04d3daf27e8925cb012e163/docutils-0.22.3-py3-none-any.whl", hash = "sha256:bd772e4aca73aff037958d44f2be5229ded4c09927fcf8690c577b66234d6ceb", size = 633032, upload-time = "2025-11-06T02:35:52.391Z" },
]

[[package]]
name = "et-xmlfile"
version = "2.0.0"
@@ -386,6 +507,15 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/c1/8b/5fe2cc11fee489817272089c4203e679c63b570a5aaeb18d852ae3cbba6a/et_xmlfile-2.0.0-py3-none-any.whl", hash = "sha256:7a91720bc756843502c3b7504c77b8fe44217c85c537d85037f0f536151b2caa", size = 18059, upload-time = "2024-10-25T17:25:39.051Z" },
]

[[package]]
name = "filelock"
version = "3.20.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/58/46/0028a82567109b5ef6e4d2a1f04a583fb513e6cf9527fcdd09afd817deeb/filelock-3.20.0.tar.gz", hash = "sha256:711e943b4ec6be42e1d4e6690b48dc175c822967466bb31c0c293f34334c13f4", size = 18922, upload-time = "2025-10-08T18:03:50.056Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/76/91/7216b27286936c16f5b4d0c530087e4a54eead683e6b0b73dd0c64844af6/filelock-3.20.0-py3-none-any.whl", hash = "sha256:339b4732ffda5cd79b13f4e2711a31b0365ce445d95d243bb996273d072546a2", size = 16054, upload-time = "2025-10-08T18:03:48.35Z" },
]

[[package]]
name = "flatbuffers"
version = "25.2.10"
@@ -461,6 +591,8 @@ version = "0.1.0"
source = { virtual = "." }
dependencies = [
    { name = "aiohttp" },
    { name = "certifi" },
    { name = "click" },
    { name = "html2text" },
    { name = "mammoth" },
    { name = "markitdown", extra = ["all"] },
@@ -470,20 +602,32 @@ dependencies = [
    { name = "pillow" },
    { name = "python-dateutil" },
    { name = "python-docx" },
    { name = "requests" },
    { name = "rich" },
    { name = "textual" },
    { name = "textual-image" },
    { name = "ticktick-py" },
]

[package.dev-dependencies]
dev = [
    { name = "black" },
    { name = "build" },
    { name = "mypy" },
    { name = "pre-commit" },
    { name = "pytest" },
    { name = "pytest-asyncio" },
    { name = "pytest-cov" },
    { name = "ruff" },
    { name = "textual" },
    { name = "twine" },
]

[package.metadata]
requires-dist = [
    { name = "aiohttp", specifier = ">=3.11.18" },
    { name = "certifi", specifier = ">=2025.4.26" },
    { name = "click", specifier = ">=8.1.0" },
    { name = "html2text", specifier = ">=2025.4.15" },
    { name = "mammoth", specifier = ">=1.9.0" },
    { name = "markitdown", extras = ["all"], specifier = ">=0.1.1" },
@@ -493,15 +637,25 @@ requires-dist = [
    { name = "pillow", specifier = ">=11.2.1" },
    { name = "python-dateutil", specifier = ">=2.9.0.post0" },
    { name = "python-docx", specifier = ">=1.1.2" },
    { name = "requests", specifier = ">=2.31.0" },
    { name = "rich", specifier = ">=14.0.0" },
    { name = "textual", specifier = ">=3.2.0" },
    { name = "textual-image", specifier = ">=0.8.2" },
    { name = "ticktick-py", specifier = ">=2.0.0" },
]

[package.metadata.requires-dev]
dev = [
    { name = "black", specifier = ">=24.0.0" },
    { name = "build", specifier = ">=1.0.0" },
    { name = "mypy", specifier = ">=1.8.0" },
    { name = "pre-commit", specifier = ">=3.5.0" },
    { name = "pytest", specifier = ">=8.0.0" },
    { name = "pytest-asyncio", specifier = ">=0.24.0" },
    { name = "pytest-cov", specifier = ">=6.0.0" },
    { name = "ruff", specifier = ">=0.11.8" },
    { name = "textual", specifier = ">=3.2.0" },
    { name = "twine", specifier = ">=5.0.0" },
]

[[package]]
@@ -562,6 +716,27 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/f0/0f/310fb31e39e2d734ccaa2c0fb981ee41f7bd5056ce9bc29b2248bd569169/humanfriendly-10.0-py2.py3-none-any.whl", hash = "sha256:1697e1a8a8f550fd43c2865cd84542fc175a61dcb779b6fee18cf6b6ccba1477", size = 86794, upload-time = "2021-09-17T21:40:39.897Z" },
]

[[package]]
name = "id"
version = "1.5.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "requests" },
]
sdist = { url = "https://files.pythonhosted.org/packages/22/11/102da08f88412d875fa2f1a9a469ff7ad4c874b0ca6fed0048fe385bdb3d/id-1.5.0.tar.gz", hash = "sha256:292cb8a49eacbbdbce97244f47a97b4c62540169c976552e497fd57df0734c1d", size = 15237, upload-time = "2024-12-04T19:53:05.575Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/9f/cb/18326d2d89ad3b0dd143da971e77afd1e6ca6674f1b1c3df4b6bec6279fc/id-1.5.0-py3-none-any.whl", hash = "sha256:f1434e1cef91f2cbb8a4ec64663d5a23b9ed43ef44c4c957d02583d61714c658", size = 13611, upload-time = "2024-12-04T19:53:03.02Z" },
]

[[package]]
name = "identify"
version = "2.6.15"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/ff/e7/685de97986c916a6d93b3876139e00eef26ad5bbbd61925d670ae8013449/identify-2.6.15.tar.gz", hash = "sha256:e4f4864b96c6557ef2a1e1c951771838f4edc9df3a72ec7118b338801b11c7bf", size = 99311, upload-time = "2025-10-02T17:43:40.631Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/0f/1c/e5fd8f973d4f375adb21565739498e2e9a1e54c858a97b9a8ccfdc81da9b/identify-2.6.15-py2.py3-none-any.whl", hash = "sha256:1181ef7608e00704db228516541eb83a88a9f94433a8c80bb9b5bd54b1d81757", size = 99183, upload-time = "2025-10-02T17:43:39.137Z" },
]

[[package]]
name = "idna"
version = "3.10"
@@ -571,6 +746,15 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3", size = 70442, upload-time = "2024-09-15T18:07:37.964Z" },
]

[[package]]
name = "iniconfig"
version = "2.3.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/72/34/14ca021ce8e5dfedc35312d08ba8bf51fdd999c576889fc2c24cb97f4f10/iniconfig-2.3.0.tar.gz", hash = "sha256:c76315c77db068650d49c5b56314774a7804df16fee4402c1f19d6d15d8c4730", size = 20503, upload-time = "2025-10-18T21:55:43.219Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/cb/b1/3846dd7f199d53cb17f49cba7e651e9ce294d8497c8c150530ed11865bb8/iniconfig-2.3.0-py3-none-any.whl", hash = "sha256:f631c04d2c48c52b84d0d0549c99ff3859c98df65b3101406327ecc7d53fbf12", size = 7484, upload-time = "2025-10-18T21:55:41.639Z" },
]

[[package]]
name = "isodate"
version = "0.7.2"
@@ -580,6 +764,48 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/15/aa/0aca39a37d3c7eb941ba736ede56d689e7be91cab5d9ca846bde3999eba6/isodate-0.7.2-py3-none-any.whl", hash = "sha256:28009937d8031054830160fce6d409ed342816b543597cece116d966c6d99e15", size = 22320, upload-time = "2024-10-08T23:04:09.501Z" },
]

[[package]]
name = "jaraco-classes"
version = "3.4.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "more-itertools" },
]
sdist = { url = "https://files.pythonhosted.org/packages/06/c0/ed4a27bc5571b99e3cff68f8a9fa5b56ff7df1c2251cc715a652ddd26402/jaraco.classes-3.4.0.tar.gz", hash = "sha256:47a024b51d0239c0dd8c8540c6c7f484be3b8fcf0b2d85c13825780d3b3f3acd", size = 11780, upload-time = "2024-03-31T07:27:36.643Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/7f/66/b15ce62552d84bbfcec9a4873ab79d993a1dd4edb922cbfccae192bd5b5f/jaraco.classes-3.4.0-py3-none-any.whl", hash = "sha256:f662826b6bed8cace05e7ff873ce0f9283b5c924470fe664fff1c2f00f581790", size = 6777, upload-time = "2024-03-31T07:27:34.792Z" },
]

[[package]]
name = "jaraco-context"
version = "6.0.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/df/ad/f3777b81bf0b6e7bc7514a1656d3e637b2e8e15fab2ce3235730b3e7a4e6/jaraco_context-6.0.1.tar.gz", hash = "sha256:9bae4ea555cf0b14938dc0aee7c9f32ed303aa20a3b73e7dc80111628792d1b3", size = 13912, upload-time = "2024-08-20T03:39:27.358Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/ff/db/0c52c4cf5e4bd9f5d7135ec7669a3a767af21b3a308e1ed3674881e52b62/jaraco.context-6.0.1-py3-none-any.whl", hash = "sha256:f797fc481b490edb305122c9181830a3a5b76d84ef6d1aef2fb9b47ab956f9e4", size = 6825, upload-time = "2024-08-20T03:39:25.966Z" },
]

[[package]]
name = "jaraco-functools"
version = "4.3.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "more-itertools" },
]
sdist = { url = "https://files.pythonhosted.org/packages/f7/ed/1aa2d585304ec07262e1a83a9889880701079dde796ac7b1d1826f40c63d/jaraco_functools-4.3.0.tar.gz", hash = "sha256:cfd13ad0dd2c47a3600b439ef72d8615d482cedcff1632930d6f28924d92f294", size = 19755, upload-time = "2025-08-18T20:05:09.91Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/b4/09/726f168acad366b11e420df31bf1c702a54d373a83f968d94141a8c3fde0/jaraco_functools-4.3.0-py3-none-any.whl", hash = "sha256:227ff8ed6f7b8f62c56deff101545fa7543cf2c8e7b82a7c2116e672f29c26e8", size = 10408, upload-time = "2025-08-18T20:05:08.69Z" },
]

[[package]]
name = "jeepney"
version = "0.9.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/7b/6f/357efd7602486741aa73ffc0617fb310a29b588ed0fd69c2399acbb85b0c/jeepney-0.9.0.tar.gz", hash = "sha256:cf0e9e845622b81e4a28df94c40345400256ec608d0e55bb8a3feaa9163f5732", size = 106758, upload-time = "2025-02-27T18:51:01.684Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/b2/a3/e137168c9c44d18eff0376253da9f1e9234d0239e0ee230d2fee6cea8e55/jeepney-0.9.0-py3-none-any.whl", hash = "sha256:97e5714520c16fc0a45695e5365a2e11b81ea79bba796e26f9f1d178cb182683", size = 49010, upload-time = "2025-02-27T18:51:00.104Z" },
]

[[package]]
name = "jiter"
version = "0.9.0"
@@ -615,6 +841,75 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/ee/47/3729f00f35a696e68da15d64eb9283c330e776f3b5789bac7f2c0c4df209/jiter-0.9.0-cp313-cp313t-win_amd64.whl", hash = "sha256:6f7838bc467ab7e8ef9f387bd6de195c43bad82a569c1699cb822f6609dd4cdf", size = 206867, upload-time = "2025-03-10T21:36:25.843Z" },
]

[[package]]
name = "keyring"
version = "25.7.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "jaraco-classes" },
    { name = "jaraco-context" },
    { name = "jaraco-functools" },
    { name = "jeepney", marker = "sys_platform == 'linux'" },
    { name = "pywin32-ctypes", marker = "sys_platform == 'win32'" },
    { name = "secretstorage", marker = "sys_platform == 'linux'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/43/4b/674af6ef2f97d56f0ab5153bf0bfa28ccb6c3ed4d1babf4305449668807b/keyring-25.7.0.tar.gz", hash = "sha256:fe01bd85eb3f8fb3dd0405defdeac9a5b4f6f0439edbb3149577f244a2e8245b", size = 63516, upload-time = "2025-11-16T16:26:09.482Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/81/db/e655086b7f3a705df045bf0933bdd9c2f79bb3c97bfef1384598bb79a217/keyring-25.7.0-py3-none-any.whl", hash = "sha256:be4a0b195f149690c166e850609a477c532ddbfbaed96a404d4e43f8d5e2689f", size = 39160, upload-time = "2025-11-16T16:26:08.402Z" },
]
[[package]]
|
||||
name = "librt"
|
||||
version = "0.7.3"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/b3/d9/6f3d3fcf5e5543ed8a60cc70fa7d50508ed60b8a10e9af6d2058159ab54e/librt-0.7.3.tar.gz", hash = "sha256:3ec50cf65235ff5c02c5b747748d9222e564ad48597122a361269dd3aa808798", size = 144549, upload-time = "2025-12-06T19:04:45.553Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/29/90/ed8595fa4e35b6020317b5ea8d226a782dcbac7a997c19ae89fb07a41c66/librt-0.7.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:0fa9ac2e49a6bee56e47573a6786cb635e128a7b12a0dc7851090037c0d397a3", size = 55687, upload-time = "2025-12-06T19:03:39.245Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/dd/f6/6a20702a07b41006cb001a759440cb6b5362530920978f64a2b2ae2bf729/librt-0.7.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:2e980cf1ed1a2420a6424e2ed884629cdead291686f1048810a817de07b5eb18", size = 57127, upload-time = "2025-12-06T19:03:40.3Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/79/f3/b0c4703d5ffe9359b67bb2ccb86c42d4e930a363cfc72262ac3ba53cff3e/librt-0.7.3-cp312-cp312-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:e094e445c37c57e9ec612847812c301840239d34ccc5d153a982fa9814478c60", size = 165336, upload-time = "2025-12-06T19:03:41.369Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/02/69/3ba05b73ab29ccbe003856232cea4049769be5942d799e628d1470ed1694/librt-0.7.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:aca73d70c3f553552ba9133d4a09e767dcfeee352d8d8d3eb3f77e38a3beb3ed", size = 174237, upload-time = "2025-12-06T19:03:42.44Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/22/ad/d7c2671e7bf6c285ef408aa435e9cd3fdc06fd994601e1f2b242df12034f/librt-0.7.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c634a0a6db395fdaba0361aa78395597ee72c3aad651b9a307a3a7eaf5efd67e", size = 189017, upload-time = "2025-12-06T19:03:44.01Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/f4/94/d13f57193148004592b618555f296b41d2d79b1dc814ff8b3273a0bf1546/librt-0.7.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:a59a69deeb458c858b8fea6acf9e2acd5d755d76cd81a655256bc65c20dfff5b", size = 183983, upload-time = "2025-12-06T19:03:45.834Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/02/10/b612a9944ebd39fa143c7e2e2d33f2cb790205e025ddd903fb509a3a3bb3/librt-0.7.3-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:d91e60ac44bbe3a77a67af4a4c13114cbe9f6d540337ce22f2c9eaf7454ca71f", size = 177602, upload-time = "2025-12-06T19:03:46.944Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/1f/48/77bc05c4cc232efae6c5592c0095034390992edbd5bae8d6cf1263bb7157/librt-0.7.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:703456146dc2bf430f7832fd1341adac5c893ec3c1430194fdcefba00012555c", size = 199282, upload-time = "2025-12-06T19:03:48.069Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/12/aa/05916ccd864227db1ffec2a303ae34f385c6b22d4e7ce9f07054dbcf083c/librt-0.7.3-cp312-cp312-win32.whl", hash = "sha256:b7c1239b64b70be7759554ad1a86288220bbb04d68518b527783c4ad3fb4f80b", size = 47879, upload-time = "2025-12-06T19:03:49.289Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/50/92/7f41c42d31ea818b3c4b9cc1562e9714bac3c676dd18f6d5dd3d0f2aa179/librt-0.7.3-cp312-cp312-win_amd64.whl", hash = "sha256:ef59c938f72bdbc6ab52dc50f81d0637fde0f194b02d636987cea2ab30f8f55a", size = 54972, upload-time = "2025-12-06T19:03:50.335Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/3f/dc/53582bbfb422311afcbc92adb75711f04e989cec052f08ec0152fbc36c9c/librt-0.7.3-cp312-cp312-win_arm64.whl", hash = "sha256:ff21c554304e8226bf80c3a7754be27c6c3549a9fec563a03c06ee8f494da8fc", size = 48338, upload-time = "2025-12-06T19:03:51.431Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/93/7d/e0ce1837dfb452427db556e6d4c5301ba3b22fe8de318379fbd0593759b9/librt-0.7.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:56f2a47beda8409061bc1c865bef2d4bd9ff9255219402c0817e68ab5ad89aed", size = 55742, upload-time = "2025-12-06T19:03:52.459Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/be/c0/3564262301e507e1d5cf31c7d84cb12addf0d35e05ba53312494a2eba9a4/librt-0.7.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:14569ac5dd38cfccf0a14597a88038fb16811a6fede25c67b79c6d50fc2c8fdc", size = 57163, upload-time = "2025-12-06T19:03:53.516Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/be/ac/245e72b7e443d24a562f6047563c7f59833384053073ef9410476f68505b/librt-0.7.3-cp313-cp313-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:6038ccbd5968325a5d6fd393cf6e00b622a8de545f0994b89dd0f748dcf3e19e", size = 165840, upload-time = "2025-12-06T19:03:54.918Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/98/af/587e4491f40adba066ba39a450c66bad794c8d92094f936a201bfc7c2b5f/librt-0.7.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d39079379a9a28e74f4d57dc6357fa310a1977b51ff12239d7271ec7e71d67f5", size = 174827, upload-time = "2025-12-06T19:03:56.082Z" },
{ url = "https://files.pythonhosted.org/packages/78/21/5b8c60ea208bc83dd00421022a3874330685d7e856404128dc3728d5d1af/librt-0.7.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8837d5a52a2d7aa9f4c3220a8484013aed1d8ad75240d9a75ede63709ef89055", size = 189612, upload-time = "2025-12-06T19:03:57.507Z" },
{ url = "https://files.pythonhosted.org/packages/da/2f/8b819169ef696421fb81cd04c6cdf225f6e96f197366001e9d45180d7e9e/librt-0.7.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:399bbd7bcc1633c3e356ae274a1deb8781c7bf84d9c7962cc1ae0c6e87837292", size = 184584, upload-time = "2025-12-06T19:03:58.686Z" },
{ url = "https://files.pythonhosted.org/packages/6c/fc/af9d225a9395b77bd7678362cb055d0b8139c2018c37665de110ca388022/librt-0.7.3-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:8d8cf653e798ee4c4e654062b633db36984a1572f68c3aa25e364a0ddfbbb910", size = 178269, upload-time = "2025-12-06T19:03:59.769Z" },
{ url = "https://files.pythonhosted.org/packages/6c/d8/7b4fa1683b772966749d5683aa3fd605813defffe157833a8fa69cc89207/librt-0.7.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:2f03484b54bf4ae80ab2e504a8d99d20d551bfe64a7ec91e218010b467d77093", size = 199852, upload-time = "2025-12-06T19:04:00.901Z" },
{ url = "https://files.pythonhosted.org/packages/77/e8/4598413aece46ca38d9260ef6c51534bd5f34b5c21474fcf210ce3a02123/librt-0.7.3-cp313-cp313-win32.whl", hash = "sha256:44b3689b040df57f492e02cd4f0bacd1b42c5400e4b8048160c9d5e866de8abe", size = 47936, upload-time = "2025-12-06T19:04:02.054Z" },
{ url = "https://files.pythonhosted.org/packages/af/80/ac0e92d5ef8c6791b3e2c62373863827a279265e0935acdf807901353b0e/librt-0.7.3-cp313-cp313-win_amd64.whl", hash = "sha256:6b407c23f16ccc36614c136251d6b32bf30de7a57f8e782378f1107be008ddb0", size = 54965, upload-time = "2025-12-06T19:04:03.224Z" },
{ url = "https://files.pythonhosted.org/packages/f1/fd/042f823fcbff25c1449bb4203a29919891ca74141b68d3a5f6612c4ce283/librt-0.7.3-cp313-cp313-win_arm64.whl", hash = "sha256:abfc57cab3c53c4546aee31859ef06753bfc136c9d208129bad23e2eca39155a", size = 48350, upload-time = "2025-12-06T19:04:04.234Z" },
{ url = "https://files.pythonhosted.org/packages/3e/ae/c6ecc7bb97134a71b5241e8855d39964c0e5f4d96558f0d60593892806d2/librt-0.7.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:120dd21d46ff875e849f1aae19346223cf15656be489242fe884036b23d39e93", size = 55175, upload-time = "2025-12-06T19:04:05.308Z" },
{ url = "https://files.pythonhosted.org/packages/cf/bc/2cc0cb0ab787b39aa5c7645cd792433c875982bdf12dccca558b89624594/librt-0.7.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:1617bea5ab31266e152871208502ee943cb349c224846928a1173c864261375e", size = 56881, upload-time = "2025-12-06T19:04:06.674Z" },
{ url = "https://files.pythonhosted.org/packages/8e/87/397417a386190b70f5bf26fcedbaa1515f19dce33366e2684c6b7ee83086/librt-0.7.3-cp314-cp314-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:93b2a1f325fefa1482516ced160c8c7b4b8d53226763fa6c93d151fa25164207", size = 163710, upload-time = "2025-12-06T19:04:08.437Z" },
{ url = "https://files.pythonhosted.org/packages/c9/37/7338f85b80e8a17525d941211451199845093ca242b32efbf01df8531e72/librt-0.7.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f3d4801db8354436fd3936531e7f0e4feb411f62433a6b6cb32bb416e20b529f", size = 172471, upload-time = "2025-12-06T19:04:10.124Z" },
{ url = "https://files.pythonhosted.org/packages/3b/e0/741704edabbfae2c852fedc1b40d9ed5a783c70ed3ed8e4fe98f84b25d13/librt-0.7.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:11ad45122bbed42cfc8b0597450660126ef28fd2d9ae1a219bc5af8406f95678", size = 186804, upload-time = "2025-12-06T19:04:11.586Z" },
{ url = "https://files.pythonhosted.org/packages/f4/d1/0a82129d6ba242f3be9af34815be089f35051bc79619f5c27d2c449ecef6/librt-0.7.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:6b4e7bff1d76dd2b46443078519dc75df1b5e01562345f0bb740cea5266d8218", size = 181817, upload-time = "2025-12-06T19:04:12.802Z" },
{ url = "https://files.pythonhosted.org/packages/4f/32/704f80bcf9979c68d4357c46f2af788fbf9d5edda9e7de5786ed2255e911/librt-0.7.3-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:d86f94743a11873317094326456b23f8a5788bad9161fd2f0e52088c33564620", size = 175602, upload-time = "2025-12-06T19:04:14.004Z" },
{ url = "https://files.pythonhosted.org/packages/f7/6d/4355cfa0fae0c062ba72f541d13db5bc575770125a7ad3d4f46f4109d305/librt-0.7.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:754a0d09997095ad764ccef050dd5bf26cbf457aab9effcba5890dad081d879e", size = 196497, upload-time = "2025-12-06T19:04:15.487Z" },
{ url = "https://files.pythonhosted.org/packages/2e/eb/ac6d8517d44209e5a712fde46f26d0055e3e8969f24d715f70bd36056230/librt-0.7.3-cp314-cp314-win32.whl", hash = "sha256:fbd7351d43b80d9c64c3cfcb50008f786cc82cba0450e8599fdd64f264320bd3", size = 44678, upload-time = "2025-12-06T19:04:16.688Z" },
{ url = "https://files.pythonhosted.org/packages/e9/93/238f026d141faf9958da588c761a0812a1a21c98cc54a76f3608454e4e59/librt-0.7.3-cp314-cp314-win_amd64.whl", hash = "sha256:d376a35c6561e81d2590506804b428fc1075fcc6298fc5bb49b771534c0ba010", size = 51689, upload-time = "2025-12-06T19:04:17.726Z" },
{ url = "https://files.pythonhosted.org/packages/52/44/43f462ad9dcf9ed7d3172fe2e30d77b980956250bd90e9889a9cca93df2a/librt-0.7.3-cp314-cp314-win_arm64.whl", hash = "sha256:cbdb3f337c88b43c3b49ca377731912c101178be91cb5071aac48faa898e6f8e", size = 44662, upload-time = "2025-12-06T19:04:18.771Z" },
{ url = "https://files.pythonhosted.org/packages/1d/35/fed6348915f96b7323241de97f26e2af481e95183b34991df12fd5ce31b1/librt-0.7.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:9f0e0927efe87cd42ad600628e595a1a0aa1c64f6d0b55f7e6059079a428641a", size = 57347, upload-time = "2025-12-06T19:04:19.812Z" },
{ url = "https://files.pythonhosted.org/packages/9a/f2/045383ccc83e3fea4fba1b761796584bc26817b6b2efb6b8a6731431d16f/librt-0.7.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:020c6db391268bcc8ce75105cb572df8cb659a43fd347366aaa407c366e5117a", size = 59223, upload-time = "2025-12-06T19:04:20.862Z" },
{ url = "https://files.pythonhosted.org/packages/77/3f/c081f8455ab1d7f4a10dbe58463ff97119272ff32494f21839c3b9029c2c/librt-0.7.3-cp314-cp314t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:7af7785f5edd1f418da09a8cdb9ec84b0213e23d597413e06525340bcce1ea4f", size = 183861, upload-time = "2025-12-06T19:04:21.963Z" },
{ url = "https://files.pythonhosted.org/packages/1d/f5/73c5093c22c31fbeaebc25168837f05ebfd8bf26ce00855ef97a5308f36f/librt-0.7.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8ccadf260bb46a61b9c7e89e2218f6efea9f3eeaaab4e3d1f58571890e54858e", size = 194594, upload-time = "2025-12-06T19:04:23.14Z" },
{ url = "https://files.pythonhosted.org/packages/78/b8/d5f17d4afe16612a4a94abfded94c16c5a033f183074fb130dfe56fc1a42/librt-0.7.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d9883b2d819ce83f87ba82a746c81d14ada78784db431e57cc9719179847376e", size = 206759, upload-time = "2025-12-06T19:04:24.328Z" },
{ url = "https://files.pythonhosted.org/packages/36/2e/021765c1be85ee23ffd5b5b968bb4cba7526a4db2a0fc27dcafbdfc32da7/librt-0.7.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:59cb0470612d21fa1efddfa0dd710756b50d9c7fb6c1236bbf8ef8529331dc70", size = 203210, upload-time = "2025-12-06T19:04:25.544Z" },
{ url = "https://files.pythonhosted.org/packages/77/f0/9923656e42da4fd18c594bd08cf6d7e152d4158f8b808e210d967f0dcceb/librt-0.7.3-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:1fe603877e1865b5fd047a5e40379509a4a60204aa7aa0f72b16f7a41c3f0712", size = 196708, upload-time = "2025-12-06T19:04:26.725Z" },
{ url = "https://files.pythonhosted.org/packages/fc/0b/0708b886ac760e64d6fbe7e16024e4be3ad1a3629d19489a97e9cf4c3431/librt-0.7.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:5460d99ed30f043595bbdc888f542bad2caeb6226b01c33cda3ae444e8f82d42", size = 217212, upload-time = "2025-12-06T19:04:27.892Z" },
{ url = "https://files.pythonhosted.org/packages/5d/7f/12a73ff17bca4351e73d585dd9ebf46723c4a8622c4af7fe11a2e2d011ff/librt-0.7.3-cp314-cp314t-win32.whl", hash = "sha256:d09f677693328503c9e492e33e9601464297c01f9ebd966ea8fc5308f3069bfd", size = 45586, upload-time = "2025-12-06T19:04:29.116Z" },
{ url = "https://files.pythonhosted.org/packages/e2/df/8decd032ac9b995e4f5606cde783711a71094128d88d97a52e397daf2c89/librt-0.7.3-cp314-cp314t-win_amd64.whl", hash = "sha256:25711f364c64cab2c910a0247e90b51421e45dbc8910ceeb4eac97a9e132fc6f", size = 53002, upload-time = "2025-12-06T19:04:30.173Z" },
{ url = "https://files.pythonhosted.org/packages/de/0c/6605b6199de8178afe7efc77ca1d8e6db00453bc1d3349d27605c0f42104/librt-0.7.3-cp314-cp314t-win_arm64.whl", hash = "sha256:a9f9b661f82693eb56beb0605156c7fca57f535704ab91837405913417d6990b", size = 45647, upload-time = "2025-12-06T19:04:31.302Z" },
]
[[package]]
name = "linkify-it-py"
version = "2.0.3"
@@ -785,6 +1080,15 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/b3/38/89ba8ad64ae25be8de66a6d463314cf1eb366222074cfda9ee839c56a4b4/mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8", size = 9979, upload-time = "2022-08-14T12:40:09.779Z" },
]
[[package]]
name = "more-itertools"
version = "10.8.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/ea/5d/38b681d3fce7a266dd9ab73c66959406d565b3e85f21d5e66e1181d93721/more_itertools-10.8.0.tar.gz", hash = "sha256:f638ddf8a1a0d134181275fb5d58b086ead7c6a72429ad725c67503f13ba30bd", size = 137431, upload-time = "2025-09-02T15:23:11.018Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/a4/8e/469e5a4a2f5855992e425f3cb33804cc07bf18d48f2db061aec61ce50270/more_itertools-10.8.0-py3-none-any.whl", hash = "sha256:52d4362373dcf7c52546bc4af9a86ee7c4579df9a8dc268be0a2f949d376cc9b", size = 69667, upload-time = "2025-09-02T15:23:09.635Z" },
]
[[package]]
name = "mpmath"
version = "1.3.0"
@@ -880,6 +1184,90 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/96/10/7d526c8974f017f1e7ca584c71ee62a638e9334d8d33f27d7cdfc9ae79e4/multidict-6.4.3-py3-none-any.whl", hash = "sha256:59fe01ee8e2a1e8ceb3f6dbb216b09c8d9f4ef1c22c4fc825d045a147fa2ebc9", size = 10400, upload-time = "2025-04-10T22:20:16.445Z" },
]
[[package]]
name = "mypy"
version = "1.19.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "librt" },
{ name = "mypy-extensions" },
{ name = "pathspec" },
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/f9/b5/b58cdc25fadd424552804bf410855d52324183112aa004f0732c5f6324cf/mypy-1.19.0.tar.gz", hash = "sha256:f6b874ca77f733222641e5c46e4711648c4037ea13646fd0cdc814c2eaec2528", size = 3579025, upload-time = "2025-11-28T15:49:01.26Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/11/7e/1afa8fb188b876abeaa14460dc4983f909aaacaa4bf5718c00b2c7e0b3d5/mypy-1.19.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:0fb3115cb8fa7c5f887c8a8d81ccdcb94cff334684980d847e5a62e926910e1d", size = 13207728, upload-time = "2025-11-28T15:46:26.463Z" },
{ url = "https://files.pythonhosted.org/packages/b2/13/f103d04962bcbefb1644f5ccb235998b32c337d6c13145ea390b9da47f3e/mypy-1.19.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f3e19e3b897562276bb331074d64c076dbdd3e79213f36eed4e592272dabd760", size = 12202945, upload-time = "2025-11-28T15:48:49.143Z" },
{ url = "https://files.pythonhosted.org/packages/e4/93/a86a5608f74a22284a8ccea8592f6e270b61f95b8588951110ad797c2ddd/mypy-1.19.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b9d491295825182fba01b6ffe2c6fe4e5a49dbf4e2bb4d1217b6ced3b4797bc6", size = 12718673, upload-time = "2025-11-28T15:47:37.193Z" },
{ url = "https://files.pythonhosted.org/packages/3d/58/cf08fff9ced0423b858f2a7495001fda28dc058136818ee9dffc31534ea9/mypy-1.19.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:6016c52ab209919b46169651b362068f632efcd5eb8ef9d1735f6f86da7853b2", size = 13608336, upload-time = "2025-11-28T15:48:32.625Z" },
{ url = "https://files.pythonhosted.org/packages/64/ed/9c509105c5a6d4b73bb08733102a3ea62c25bc02c51bca85e3134bf912d3/mypy-1.19.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:f188dcf16483b3e59f9278c4ed939ec0254aa8a60e8fc100648d9ab5ee95a431", size = 13833174, upload-time = "2025-11-28T15:45:48.091Z" },
{ url = "https://files.pythonhosted.org/packages/cd/71/01939b66e35c6f8cb3e6fdf0b657f0fd24de2f8ba5e523625c8e72328208/mypy-1.19.0-cp312-cp312-win_amd64.whl", hash = "sha256:0e3c3d1e1d62e678c339e7ade72746a9e0325de42cd2cccc51616c7b2ed1a018", size = 10112208, upload-time = "2025-11-28T15:46:41.702Z" },
{ url = "https://files.pythonhosted.org/packages/cb/0d/a1357e6bb49e37ce26fcf7e3cc55679ce9f4ebee0cd8b6ee3a0e301a9210/mypy-1.19.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:7686ed65dbabd24d20066f3115018d2dce030d8fa9db01aa9f0a59b6813e9f9e", size = 13191993, upload-time = "2025-11-28T15:47:22.336Z" },
{ url = "https://files.pythonhosted.org/packages/5d/75/8e5d492a879ec4490e6ba664b5154e48c46c85b5ac9785792a5ec6a4d58f/mypy-1.19.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:fd4a985b2e32f23bead72e2fb4bbe5d6aceee176be471243bd831d5b2644672d", size = 12174411, upload-time = "2025-11-28T15:44:55.492Z" },
{ url = "https://files.pythonhosted.org/packages/71/31/ad5dcee9bfe226e8eaba777e9d9d251c292650130f0450a280aec3485370/mypy-1.19.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:fc51a5b864f73a3a182584b1ac75c404396a17eced54341629d8bdcb644a5bba", size = 12727751, upload-time = "2025-11-28T15:44:14.169Z" },
{ url = "https://files.pythonhosted.org/packages/77/06/b6b8994ce07405f6039701f4b66e9d23f499d0b41c6dd46ec28f96d57ec3/mypy-1.19.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:37af5166f9475872034b56c5efdcf65ee25394e9e1d172907b84577120714364", size = 13593323, upload-time = "2025-11-28T15:46:34.699Z" },
{ url = "https://files.pythonhosted.org/packages/68/b1/126e274484cccdf099a8e328d4fda1c7bdb98a5e888fa6010b00e1bbf330/mypy-1.19.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:510c014b722308c9bd377993bcbf9a07d7e0692e5fa8fc70e639c1eb19fc6bee", size = 13818032, upload-time = "2025-11-28T15:46:18.286Z" },
{ url = "https://files.pythonhosted.org/packages/f8/56/53a8f70f562dfc466c766469133a8a4909f6c0012d83993143f2a9d48d2d/mypy-1.19.0-cp313-cp313-win_amd64.whl", hash = "sha256:cabbee74f29aa9cd3b444ec2f1e4fa5a9d0d746ce7567a6a609e224429781f53", size = 10120644, upload-time = "2025-11-28T15:47:43.99Z" },
{ url = "https://files.pythonhosted.org/packages/b0/f4/7751f32f56916f7f8c229fe902cbdba3e4dd3f3ea9e8b872be97e7fc546d/mypy-1.19.0-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:f2e36bed3c6d9b5f35d28b63ca4b727cb0228e480826ffc8953d1892ddc8999d", size = 13185236, upload-time = "2025-11-28T15:45:20.696Z" },
{ url = "https://files.pythonhosted.org/packages/35/31/871a9531f09e78e8d145032355890384f8a5b38c95a2c7732d226b93242e/mypy-1.19.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:a18d8abdda14035c5718acb748faec09571432811af129bf0d9e7b2d6699bf18", size = 12213902, upload-time = "2025-11-28T15:46:10.117Z" },
{ url = "https://files.pythonhosted.org/packages/58/b8/af221910dd40eeefa2077a59107e611550167b9994693fc5926a0b0f87c0/mypy-1.19.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f75e60aca3723a23511948539b0d7ed514dda194bc3755eae0bfc7a6b4887aa7", size = 12738600, upload-time = "2025-11-28T15:44:22.521Z" },
{ url = "https://files.pythonhosted.org/packages/11/9f/c39e89a3e319c1d9c734dedec1183b2cc3aefbab066ec611619002abb932/mypy-1.19.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8f44f2ae3c58421ee05fe609160343c25f70e3967f6e32792b5a78006a9d850f", size = 13592639, upload-time = "2025-11-28T15:48:08.55Z" },
{ url = "https://files.pythonhosted.org/packages/97/6d/ffaf5f01f5e284d9033de1267e6c1b8f3783f2cf784465378a86122e884b/mypy-1.19.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:63ea6a00e4bd6822adbfc75b02ab3653a17c02c4347f5bb0cf1d5b9df3a05835", size = 13799132, upload-time = "2025-11-28T15:47:06.032Z" },
{ url = "https://files.pythonhosted.org/packages/fe/b0/c33921e73aaa0106224e5a34822411bea38046188eb781637f5a5b07e269/mypy-1.19.0-cp314-cp314-win_amd64.whl", hash = "sha256:3ad925b14a0bb99821ff6f734553294aa6a3440a8cb082fe1f5b84dfb662afb1", size = 10269832, upload-time = "2025-11-28T15:47:29.392Z" },
{ url = "https://files.pythonhosted.org/packages/09/0e/fe228ed5aeab470c6f4eb82481837fadb642a5aa95cc8215fd2214822c10/mypy-1.19.0-py3-none-any.whl", hash = "sha256:0c01c99d626380752e527d5ce8e69ffbba2046eb8a060db0329690849cf9b6f9", size = 2469714, upload-time = "2025-11-28T15:45:33.22Z" },
]
[[package]]
name = "mypy-extensions"
version = "1.1.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/a2/6e/371856a3fb9d31ca8dac321cda606860fa4548858c0cc45d9d1d4ca2628b/mypy_extensions-1.1.0.tar.gz", hash = "sha256:52e68efc3284861e772bbcd66823fde5ae21fd2fdb51c62a211403730b916558", size = 6343, upload-time = "2025-04-22T14:54:24.164Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/79/7b/2c79738432f5c924bef5071f933bcc9efd0473bac3b4aa584a6f7c1c8df8/mypy_extensions-1.1.0-py3-none-any.whl", hash = "sha256:1be4cccdb0f2482337c4743e60421de3a356cd97508abadd57d47403e94f5505", size = 4963, upload-time = "2025-04-22T14:54:22.983Z" },
]
[[package]]
name = "nh3"
version = "0.3.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/ca/a5/34c26015d3a434409f4d2a1cd8821a06c05238703f49283ffeb937bef093/nh3-0.3.2.tar.gz", hash = "sha256:f394759a06df8b685a4ebfb1874fb67a9cbfd58c64fc5ed587a663c0e63ec376", size = 19288, upload-time = "2025-10-30T11:17:45.948Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/5b/01/a1eda067c0ba823e5e2bb033864ae4854549e49fb6f3407d2da949106bfb/nh3-0.3.2-cp314-cp314t-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl", hash = "sha256:d18957a90806d943d141cc5e4a0fefa1d77cf0d7a156878bf9a66eed52c9cc7d", size = 1419839, upload-time = "2025-10-30T11:17:09.956Z" },
{ url = "https://files.pythonhosted.org/packages/30/57/07826ff65d59e7e9cc789ef1dc405f660cabd7458a1864ab58aefa17411b/nh3-0.3.2-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45c953e57028c31d473d6b648552d9cab1efe20a42ad139d78e11d8f42a36130", size = 791183, upload-time = "2025-10-30T11:17:11.99Z" },
{ url = "https://files.pythonhosted.org/packages/af/2f/e8a86f861ad83f3bb5455f596d5c802e34fcdb8c53a489083a70fd301333/nh3-0.3.2-cp314-cp314t-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2c9850041b77a9147d6bbd6dbbf13eeec7009eb60b44e83f07fcb2910075bf9b", size = 829127, upload-time = "2025-10-30T11:17:13.192Z" },
{ url = "https://files.pythonhosted.org/packages/d8/97/77aef4daf0479754e8e90c7f8f48f3b7b8725a3b8c0df45f2258017a6895/nh3-0.3.2-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:403c11563e50b915d0efdb622866d1d9e4506bce590ef7da57789bf71dd148b5", size = 997131, upload-time = "2025-10-30T11:17:14.677Z" },
{ url = "https://files.pythonhosted.org/packages/41/ee/fd8140e4df9d52143e89951dd0d797f5546004c6043285289fbbe3112293/nh3-0.3.2-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:0dca4365db62b2d71ff1620ee4f800c4729849906c5dd504ee1a7b2389558e31", size = 1068783, upload-time = "2025-10-30T11:17:15.861Z" },
{ url = "https://files.pythonhosted.org/packages/87/64/bdd9631779e2d588b08391f7555828f352e7f6427889daf2fa424bfc90c9/nh3-0.3.2-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:0fe7ee035dd7b2290715baf29cb27167dddd2ff70ea7d052c958dbd80d323c99", size = 994732, upload-time = "2025-10-30T11:17:17.155Z" },
{ url = "https://files.pythonhosted.org/packages/79/66/90190033654f1f28ca98e3d76b8be1194505583f9426b0dcde782a3970a2/nh3-0.3.2-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:a40202fd58e49129764f025bbaae77028e420f1d5b3c8e6f6fd3a6490d513868", size = 975997, upload-time = "2025-10-30T11:17:18.77Z" },
{ url = "https://files.pythonhosted.org/packages/34/30/ebf8e2e8d71fdb5a5d5d8836207177aed1682df819cbde7f42f16898946c/nh3-0.3.2-cp314-cp314t-win32.whl", hash = "sha256:1f9ba555a797dbdcd844b89523f29cdc90973d8bd2e836ea6b962cf567cadd93", size = 583364, upload-time = "2025-10-30T11:17:20.286Z" },
{ url = "https://files.pythonhosted.org/packages/94/ae/95c52b5a75da429f11ca8902c2128f64daafdc77758d370e4cc310ecda55/nh3-0.3.2-cp314-cp314t-win_amd64.whl", hash = "sha256:dce4248edc427c9b79261f3e6e2b3ecbdd9b88c267012168b4a7b3fc6fd41d13", size = 589982, upload-time = "2025-10-30T11:17:21.384Z" },
{ url = "https://files.pythonhosted.org/packages/b4/bd/c7d862a4381b95f2469704de32c0ad419def0f4a84b7a138a79532238114/nh3-0.3.2-cp314-cp314t-win_arm64.whl", hash = "sha256:019ecbd007536b67fdf76fab411b648fb64e2257ca3262ec80c3425c24028c80", size = 577126, upload-time = "2025-10-30T11:17:22.755Z" },
{ url = "https://files.pythonhosted.org/packages/b6/3e/f5a5cc2885c24be13e9b937441bd16a012ac34a657fe05e58927e8af8b7a/nh3-0.3.2-cp38-abi3-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl", hash = "sha256:7064ccf5ace75825bd7bf57859daaaf16ed28660c1c6b306b649a9eda4b54b1e", size = 1431980, upload-time = "2025-10-30T11:17:25.457Z" },
{ url = "https://files.pythonhosted.org/packages/7f/f7/529a99324d7ef055de88b690858f4189379708abae92ace799365a797b7f/nh3-0.3.2-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c8745454cdd28bbbc90861b80a0111a195b0e3961b9fa2e672be89eb199fa5d8", size = 820805, upload-time = "2025-10-30T11:17:26.98Z" },
{ url = "https://files.pythonhosted.org/packages/3d/62/19b7c50ccd1fa7d0764822d2cea8f2a320f2fd77474c7a1805cb22cf69b0/nh3-0.3.2-cp38-abi3-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:72d67c25a84579f4a432c065e8b4274e53b7cf1df8f792cf846abfe2c3090866", size = 803527, upload-time = "2025-10-30T11:17:28.284Z" },
{ url = "https://files.pythonhosted.org/packages/4a/ca/f022273bab5440abff6302731a49410c5ef66b1a9502ba3fbb2df998d9ff/nh3-0.3.2-cp38-abi3-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:13398e676a14d6233f372c75f52d5ae74f98210172991f7a3142a736bd92b131", size = 1051674, upload-time = "2025-10-30T11:17:29.909Z" },
{ url = "https://files.pythonhosted.org/packages/fa/f7/5728e3b32a11daf5bd21cf71d91c463f74305938bc3eb9e0ac1ce141646e/nh3-0.3.2-cp38-abi3-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:03d617e5c8aa7331bd2659c654e021caf9bba704b109e7b2b28b039a00949fe5", size = 1004737, upload-time = "2025-10-30T11:17:31.205Z" },
{ url = "https://files.pythonhosted.org/packages/53/7f/f17e0dba0a99cee29e6cee6d4d52340ef9cb1f8a06946d3a01eb7ec2fb01/nh3-0.3.2-cp38-abi3-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f2f55c4d2d5a207e74eefe4d828067bbb01300e06e2a7436142f915c5928de07", size = 911745, upload-time = "2025-10-30T11:17:32.945Z" },
{ url = "https://files.pythonhosted.org/packages/42/0f/c76bf3dba22c73c38e9b1113b017cf163f7696f50e003404ec5ecdb1e8a6/nh3-0.3.2-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7bb18403f02b655a1bbe4e3a4696c2ae1d6ae8f5991f7cacb684b1ae27e6c9f7", size = 797184, upload-time = "2025-10-30T11:17:34.226Z" },
{ url = "https://files.pythonhosted.org/packages/08/a1/73d8250f888fb0ddf1b119b139c382f8903d8bb0c5bd1f64afc7e38dad1d/nh3-0.3.2-cp38-abi3-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:6d66f41672eb4060cf87c037f760bdbc6847852ca9ef8e9c5a5da18f090abf87", size = 838556, upload-time = "2025-10-30T11:17:35.875Z" },
{ url = "https://files.pythonhosted.org/packages/d1/09/deb57f1fb656a7a5192497f4a287b0ade5a2ff6b5d5de4736d13ef6d2c1f/nh3-0.3.2-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:f97f8b25cb2681d25e2338148159447e4d689aafdccfcf19e61ff7db3905768a", size = 1006695, upload-time = "2025-10-30T11:17:37.071Z" },
{ url = "https://files.pythonhosted.org/packages/b6/61/8f4d41c4ccdac30e4b1a4fa7be4b0f9914d8314a5058472f84c8e101a418/nh3-0.3.2-cp38-abi3-musllinux_1_2_armv7l.whl", hash = "sha256:2ab70e8c6c7d2ce953d2a58102eefa90c2d0a5ed7aa40c7e29a487bc5e613131", size = 1075471, upload-time = "2025-10-30T11:17:38.225Z" },
{ url = "https://files.pythonhosted.org/packages/b0/c6/966aec0cb4705e69f6c3580422c239205d5d4d0e50fac380b21e87b6cf1b/nh3-0.3.2-cp38-abi3-musllinux_1_2_i686.whl", hash = "sha256:1710f3901cd6440ca92494ba2eb6dc260f829fa8d9196b659fa10de825610ce0", size = 1002439, upload-time = "2025-10-30T11:17:39.553Z" },
{ url = "https://files.pythonhosted.org/packages/e2/c8/97a2d5f7a314cce2c5c49f30c6f161b7f3617960ade4bfc2fd1ee092cb20/nh3-0.3.2-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:91e9b001101fb4500a2aafe3e7c92928d85242d38bf5ac0aba0b7480da0a4cd6", size = 987439, upload-time = "2025-10-30T11:17:40.81Z" },
{ url = "https://files.pythonhosted.org/packages/0d/95/2d6fc6461687d7a171f087995247dec33e8749a562bfadd85fb5dbf37a11/nh3-0.3.2-cp38-abi3-win32.whl", hash = "sha256:169db03df90da63286e0560ea0efa9b6f3b59844a9735514a1d47e6bb2c8c61b", size = 589826, upload-time = "2025-10-30T11:17:42.239Z" },
{ url = "https://files.pythonhosted.org/packages/64/9a/1a1c154f10a575d20dd634e5697805e589bbdb7673a0ad00e8da90044ba7/nh3-0.3.2-cp38-abi3-win_amd64.whl", hash = "sha256:562da3dca7a17f9077593214a9781a94b8d76de4f158f8c895e62f09573945fe", size = 596406, upload-time = "2025-10-30T11:17:43.773Z" },
{ url = "https://files.pythonhosted.org/packages/9e/7e/a96255f63b7aef032cbee8fc4d6e37def72e3aaedc1f72759235e8f13cb1/nh3-0.3.2-cp38-abi3-win_arm64.whl", hash = "sha256:cf5964d54edd405e68583114a7cba929468bcd7db5e676ae38ee954de1cfc104", size = 584162, upload-time = "2025-10-30T11:17:44.96Z" },
]
[[package]]
name = "nodeenv"
version = "1.9.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/43/16/fc88b08840de0e0a72a2f9d8c6bae36be573e475a6326ae854bcc549fc45/nodeenv-1.9.1.tar.gz", hash = "sha256:6ec12890a2dab7946721edbfbcd91f3319c6ccc9aec47be7c7e6b7011ee6645f", size = 47437, upload-time = "2024-06-04T18:44:11.171Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/d2/1d/1b658dbd2b9fa9c4c9f32accbfc0205d532c8c6194dc0f2a4c0428e7128a/nodeenv-1.9.1-py2.py3-none-any.whl", hash = "sha256:ba11c9782d29c27c70ffbdda2d7415098754709be8a7056d79a737cd901155c9", size = 22314, upload-time = "2024-06-04T18:44:08.352Z" },
]
[[package]]
name = "numpy"
version = "2.2.5"
@@ -1064,6 +1452,15 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/ab/5f/b38085618b950b79d2d9164a711c52b10aefc0ae6833b96f626b7021b2ed/pandas-2.2.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:ad5b65698ab28ed8d7f18790a0dc58005c7629f227be9ecc1072aa74c0c1d43a", size = 13098436, upload-time = "2024-09-20T13:09:48.112Z" },
]
[[package]]
name = "pathspec"
version = "0.12.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/ca/bc/f35b8446f4531a7cb215605d100cd88b7ac6f44ab3fc94870c120ab3adbf/pathspec-0.12.1.tar.gz", hash = "sha256:a482d51503a1ab33b1c67a6c3813a26953dbdc71c31dacaef9a838c4e29f5712", size = 51043, upload-time = "2023-12-10T22:30:45Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/cc/20/ff623b09d963f88bfde16306a54e12ee5ea43e9b597108672ff3a408aad6/pathspec-0.12.1-py3-none-any.whl", hash = "sha256:a0d503e138a4c123b27490a4f7beda6a01c6f288df0e4a8b79c7eb0dc7b4cc08", size = 31191, upload-time = "2023-12-10T22:30:43.14Z" },
]
[[package]]
name = "pdfminer-six"
version = "20250506"
@@ -1127,6 +1524,31 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/fe/39/979e8e21520d4e47a0bbe349e2713c0aac6f3d853d0e5b34d76206c439aa/platformdirs-4.3.8-py3-none-any.whl", hash = "sha256:ff7059bb7eb1179e2685604f4aaf157cfd9535242bd23742eadc3c13542139b4", size = 18567, upload-time = "2025-05-07T22:47:40.376Z" },
]
[[package]]
name = "pluggy"
version = "1.6.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/f9/e2/3e91f31a7d2b083fe6ef3fa267035b518369d9511ffab804f839851d2779/pluggy-1.6.0.tar.gz", hash = "sha256:7dcc130b76258d33b90f61b658791dede3486c3e6bfb003ee5c9bfb396dd22f3", size = 69412, upload-time = "2025-05-15T12:30:07.975Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/54/20/4d324d65cc6d9205fabedc306948156824eb9f0ee1633355a8f7ec5c66bf/pluggy-1.6.0-py3-none-any.whl", hash = "sha256:e920276dd6813095e9377c0bc5566d94c932c33b27a3e3945d8389c374dd4746", size = 20538, upload-time = "2025-05-15T12:30:06.134Z" },
]
[[package]]
name = "pre-commit"
version = "4.5.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "cfgv" },
{ name = "identify" },
{ name = "nodeenv" },
{ name = "pyyaml" },
{ name = "virtualenv" },
]
sdist = { url = "https://files.pythonhosted.org/packages/f4/9b/6a4ffb4ed980519da959e1cf3122fc6cb41211daa58dbae1c73c0e519a37/pre_commit-4.5.0.tar.gz", hash = "sha256:dc5a065e932b19fc1d4c653c6939068fe54325af8e741e74e88db4d28a4dd66b", size = 198428, upload-time = "2025-11-22T21:02:42.304Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/5d/c4/b2d28e9d2edf4f1713eb3c29307f1a63f3d67cf09bdda29715a36a68921a/pre_commit-4.5.0-py2.py3-none-any.whl", hash = "sha256:25e2ce09595174d9c97860a95609f9f852c0614ba602de3561e267547f2335e1", size = 226429, upload-time = "2025-11-22T21:02:40.836Z" },
]
[[package]]
name = "propcache"
version = "0.3.1"
@@ -1296,6 +1718,15 @@ crypto = [
{ name = "cryptography" },
]
[[package]]
name = "pyproject-hooks"
version = "1.2.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/e7/82/28175b2414effca1cdac8dc99f76d660e7a4fb0ceefa4b4ab8f5f6742925/pyproject_hooks-1.2.0.tar.gz", hash = "sha256:1e859bd5c40fae9448642dd871adf459e5e2084186e8d2c2a79a824c970da1f8", size = 19228, upload-time = "2024-09-29T09:24:13.293Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/bd/24/12818598c362d7f300f18e74db45963dbcb85150324092410c8b49405e42/pyproject_hooks-1.2.0-py3-none-any.whl", hash = "sha256:9e5c6bfa8dcc30091c74b0cf803c81fdd29d94f01992a7707bc97babb1141913", size = 10216, upload-time = "2024-09-29T09:24:11.978Z" },
]
[[package]]
name = "pyreadline3"
version = "3.5.4"
@@ -1305,6 +1736,49 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/5a/dc/491b7661614ab97483abf2056be1deee4dc2490ecbf7bff9ab5cdbac86e1/pyreadline3-3.5.4-py3-none-any.whl", hash = "sha256:eaf8e6cc3c49bcccf145fc6067ba8643d1df34d604a1ec0eccbf7a18e6d3fae6", size = 83178, upload-time = "2024-09-19T02:40:08.598Z" },
]
[[package]]
name = "pytest"
version = "9.0.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "colorama", marker = "sys_platform == 'win32'" },
{ name = "iniconfig" },
{ name = "packaging" },
{ name = "pluggy" },
{ name = "pygments" },
]
sdist = { url = "https://files.pythonhosted.org/packages/d1/db/7ef3487e0fb0049ddb5ce41d3a49c235bf9ad299b6a25d5780a89f19230f/pytest-9.0.2.tar.gz", hash = "sha256:75186651a92bd89611d1d9fc20f0b4345fd827c41ccd5c299a868a05d70edf11", size = 1568901, upload-time = "2025-12-06T21:30:51.014Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/3b/ab/b3226f0bd7cdcf710fbede2b3548584366da3b19b5021e74f5bde2a8fa3f/pytest-9.0.2-py3-none-any.whl", hash = "sha256:711ffd45bf766d5264d487b917733b453d917afd2b0ad65223959f59089f875b", size = 374801, upload-time = "2025-12-06T21:30:49.154Z" },
]
[[package]]
name = "pytest-asyncio"
version = "1.3.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "pytest" },
{ name = "typing-extensions", marker = "python_full_version < '3.13'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/90/2c/8af215c0f776415f3590cac4f9086ccefd6fd463befeae41cd4d3f193e5a/pytest_asyncio-1.3.0.tar.gz", hash = "sha256:d7f52f36d231b80ee124cd216ffb19369aa168fc10095013c6b014a34d3ee9e5", size = 50087, upload-time = "2025-11-10T16:07:47.256Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/e5/35/f8b19922b6a25bc0880171a2f1a003eaeb93657475193ab516fd87cac9da/pytest_asyncio-1.3.0-py3-none-any.whl", hash = "sha256:611e26147c7f77640e6d0a92a38ed17c3e9848063698d5c93d5aa7aa11cebff5", size = 15075, upload-time = "2025-11-10T16:07:45.537Z" },
]
[[package]]
name = "pytest-cov"
version = "7.0.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "coverage" },
{ name = "pluggy" },
{ name = "pytest" },
]
sdist = { url = "https://files.pythonhosted.org/packages/5e/f7/c933acc76f5208b3b00089573cf6a2bc26dc80a8aece8f52bb7d6b1855ca/pytest_cov-7.0.0.tar.gz", hash = "sha256:33c97eda2e049a0c5298e91f519302a1334c26ac65c1a483d6206fd458361af1", size = 54328, upload-time = "2025-09-09T10:57:02.113Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/ee/49/1377b49de7d0c1ce41292161ea0f721913fa8722c19fb9c1e3aa0367eecb/pytest_cov-7.0.0-py3-none-any.whl", hash = "sha256:3b8e9558b16cc1479da72058bdecf8073661c7f57f7d3c5f22a1c23507f2d861", size = 22424, upload-time = "2025-09-09T10:57:00.695Z" },
]
[[package]]
name = "python-dateutil"
version = "2.9.0.post0"
@@ -1355,17 +1829,101 @@ wheels = [
]
[[package]]
name = "pytz"
version = "2025.2"
name = "pytokens"
version = "0.3.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/f8/bf/abbd3cdfb8fbc7fb3d4d38d320f2441b1e7cbe29be4f23797b4a2b5d8aac/pytz-2025.2.tar.gz", hash = "sha256:360b9e3dbb49a209c21ad61809c7fb453643e048b38924c765813546746e81c3", size = 320884, upload-time = "2025-03-25T02:25:00.538Z" }
sdist = { url = "https://files.pythonhosted.org/packages/4e/8d/a762be14dae1c3bf280202ba3172020b2b0b4c537f94427435f19c413b72/pytokens-0.3.0.tar.gz", hash = "sha256:2f932b14ed08de5fcf0b391ace2642f858f1394c0857202959000b68ed7a458a", size = 17644, upload-time = "2025-11-05T13:36:35.34Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/81/c4/34e93fe5f5429d7570ec1fa436f1986fb1f00c3e0f43a589fe2bbcd22c3f/pytz-2025.2-py2.py3-none-any.whl", hash = "sha256:5ddf76296dd8c44c26eb8f4b6f35488f3ccbf6fbbd7adee0b7262d43f0ec2f00", size = 509225, upload-time = "2025-03-25T02:24:58.468Z" },
{ url = "https://files.pythonhosted.org/packages/84/25/d9db8be44e205a124f6c98bc0324b2bb149b7431c53877fc6d1038dddaf5/pytokens-0.3.0-py3-none-any.whl", hash = "sha256:95b2b5eaf832e469d141a378872480ede3f251a5a5041b8ec6e581d3ac71bbf3", size = 12195, upload-time = "2025-11-05T13:36:33.183Z" },
]
[[package]]
name = "pytz"
version = "2021.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/b0/61/eddc6eb2c682ea6fd97a7e1018a6294be80dba08fa28e7a3570148b4612d/pytz-2021.1.tar.gz", hash = "sha256:83a4a90894bf38e243cf052c8b58f381bfe9a7a483f6a9cab140bc7f702ac4da", size = 317945, upload-time = "2021-02-01T08:07:19.773Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/70/94/784178ca5dd892a98f113cdd923372024dc04b8d40abe77ca76b5fb90ca6/pytz-2021.1-py2.py3-none-any.whl", hash = "sha256:eb10ce3e7736052ed3623d49975ce333bcd712c7bb19a58b9e2089d4057d0798", size = 510782, upload-time = "2021-02-01T08:07:15.659Z" },
]
[[package]]
name = "pywin32-ctypes"
version = "0.2.3"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/85/9f/01a1a99704853cb63f253eea009390c88e7131c67e66a0a02099a8c917cb/pywin32-ctypes-0.2.3.tar.gz", hash = "sha256:d162dc04946d704503b2edc4d55f3dba5c1d539ead017afa00142c38b9885755", size = 29471, upload-time = "2024-08-14T10:15:34.626Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/de/3d/8161f7711c017e01ac9f008dfddd9410dff3674334c233bde66e7ba65bbf/pywin32_ctypes-0.2.3-py3-none-any.whl", hash = "sha256:8a1513379d709975552d202d942d9837758905c8d01eb82b8bcc30918929e7b8", size = 30756, upload-time = "2024-08-14T10:15:33.187Z" },
]
[[package]]
name = "pyyaml"
version = "6.0.3"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/05/8e/961c0007c59b8dd7729d542c61a4d537767a59645b82a0b521206e1e25c2/pyyaml-6.0.3.tar.gz", hash = "sha256:d76623373421df22fb4cf8817020cbb7ef15c725b9d5e45f17e189bfc384190f", size = 130960, upload-time = "2025-09-25T21:33:16.546Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/d1/33/422b98d2195232ca1826284a76852ad5a86fe23e31b009c9886b2d0fb8b2/pyyaml-6.0.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:7f047e29dcae44602496db43be01ad42fc6f1cc0d8cd6c83d342306c32270196", size = 182063, upload-time = "2025-09-25T21:32:11.445Z" },
{ url = "https://files.pythonhosted.org/packages/89/a0/6cf41a19a1f2f3feab0e9c0b74134aa2ce6849093d5517a0c550fe37a648/pyyaml-6.0.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:fc09d0aa354569bc501d4e787133afc08552722d3ab34836a80547331bb5d4a0", size = 173973, upload-time = "2025-09-25T21:32:12.492Z" },
{ url = "https://files.pythonhosted.org/packages/ed/23/7a778b6bd0b9a8039df8b1b1d80e2e2ad78aa04171592c8a5c43a56a6af4/pyyaml-6.0.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:9149cad251584d5fb4981be1ecde53a1ca46c891a79788c0df828d2f166bda28", size = 775116, upload-time = "2025-09-25T21:32:13.652Z" },
{ url = "https://files.pythonhosted.org/packages/65/30/d7353c338e12baef4ecc1b09e877c1970bd3382789c159b4f89d6a70dc09/pyyaml-6.0.3-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5fdec68f91a0c6739b380c83b951e2c72ac0197ace422360e6d5a959d8d97b2c", size = 844011, upload-time = "2025-09-25T21:32:15.21Z" },
{ url = "https://files.pythonhosted.org/packages/8b/9d/b3589d3877982d4f2329302ef98a8026e7f4443c765c46cfecc8858c6b4b/pyyaml-6.0.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ba1cc08a7ccde2d2ec775841541641e4548226580ab850948cbfda66a1befcdc", size = 807870, upload-time = "2025-09-25T21:32:16.431Z" },
{ url = "https://files.pythonhosted.org/packages/05/c0/b3be26a015601b822b97d9149ff8cb5ead58c66f981e04fedf4e762f4bd4/pyyaml-6.0.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:8dc52c23056b9ddd46818a57b78404882310fb473d63f17b07d5c40421e47f8e", size = 761089, upload-time = "2025-09-25T21:32:17.56Z" },
{ url = "https://files.pythonhosted.org/packages/be/8e/98435a21d1d4b46590d5459a22d88128103f8da4c2d4cb8f14f2a96504e1/pyyaml-6.0.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:41715c910c881bc081f1e8872880d3c650acf13dfa8214bad49ed4cede7c34ea", size = 790181, upload-time = "2025-09-25T21:32:18.834Z" },
{ url = "https://files.pythonhosted.org/packages/74/93/7baea19427dcfbe1e5a372d81473250b379f04b1bd3c4c5ff825e2327202/pyyaml-6.0.3-cp312-cp312-win32.whl", hash = "sha256:96b533f0e99f6579b3d4d4995707cf36df9100d67e0c8303a0c55b27b5f99bc5", size = 137658, upload-time = "2025-09-25T21:32:20.209Z" },
{ url = "https://files.pythonhosted.org/packages/86/bf/899e81e4cce32febab4fb42bb97dcdf66bc135272882d1987881a4b519e9/pyyaml-6.0.3-cp312-cp312-win_amd64.whl", hash = "sha256:5fcd34e47f6e0b794d17de1b4ff496c00986e1c83f7ab2fb8fcfe9616ff7477b", size = 154003, upload-time = "2025-09-25T21:32:21.167Z" },
{ url = "https://files.pythonhosted.org/packages/1a/08/67bd04656199bbb51dbed1439b7f27601dfb576fb864099c7ef0c3e55531/pyyaml-6.0.3-cp312-cp312-win_arm64.whl", hash = "sha256:64386e5e707d03a7e172c0701abfb7e10f0fb753ee1d773128192742712a98fd", size = 140344, upload-time = "2025-09-25T21:32:22.617Z" },
{ url = "https://files.pythonhosted.org/packages/d1/11/0fd08f8192109f7169db964b5707a2f1e8b745d4e239b784a5a1dd80d1db/pyyaml-6.0.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:8da9669d359f02c0b91ccc01cac4a67f16afec0dac22c2ad09f46bee0697eba8", size = 181669, upload-time = "2025-09-25T21:32:23.673Z" },
{ url = "https://files.pythonhosted.org/packages/b1/16/95309993f1d3748cd644e02e38b75d50cbc0d9561d21f390a76242ce073f/pyyaml-6.0.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:2283a07e2c21a2aa78d9c4442724ec1eb15f5e42a723b99cb3d822d48f5f7ad1", size = 173252, upload-time = "2025-09-25T21:32:25.149Z" },
{ url = "https://files.pythonhosted.org/packages/50/31/b20f376d3f810b9b2371e72ef5adb33879b25edb7a6d072cb7ca0c486398/pyyaml-6.0.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ee2922902c45ae8ccada2c5b501ab86c36525b883eff4255313a253a3160861c", size = 767081, upload-time = "2025-09-25T21:32:26.575Z" },
{ url = "https://files.pythonhosted.org/packages/49/1e/a55ca81e949270d5d4432fbbd19dfea5321eda7c41a849d443dc92fd1ff7/pyyaml-6.0.3-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a33284e20b78bd4a18c8c2282d549d10bc8408a2a7ff57653c0cf0b9be0afce5", size = 841159, upload-time = "2025-09-25T21:32:27.727Z" },
{ url = "https://files.pythonhosted.org/packages/74/27/e5b8f34d02d9995b80abcef563ea1f8b56d20134d8f4e5e81733b1feceb2/pyyaml-6.0.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0f29edc409a6392443abf94b9cf89ce99889a1dd5376d94316ae5145dfedd5d6", size = 801626, upload-time = "2025-09-25T21:32:28.878Z" },
{ url = "https://files.pythonhosted.org/packages/f9/11/ba845c23988798f40e52ba45f34849aa8a1f2d4af4b798588010792ebad6/pyyaml-6.0.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:f7057c9a337546edc7973c0d3ba84ddcdf0daa14533c2065749c9075001090e6", size = 753613, upload-time = "2025-09-25T21:32:30.178Z" },
{ url = "https://files.pythonhosted.org/packages/3d/e0/7966e1a7bfc0a45bf0a7fb6b98ea03fc9b8d84fa7f2229e9659680b69ee3/pyyaml-6.0.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:eda16858a3cab07b80edaf74336ece1f986ba330fdb8ee0d6c0d68fe82bc96be", size = 794115, upload-time = "2025-09-25T21:32:31.353Z" },
{ url = "https://files.pythonhosted.org/packages/de/94/980b50a6531b3019e45ddeada0626d45fa85cbe22300844a7983285bed3b/pyyaml-6.0.3-cp313-cp313-win32.whl", hash = "sha256:d0eae10f8159e8fdad514efdc92d74fd8d682c933a6dd088030f3834bc8e6b26", size = 137427, upload-time = "2025-09-25T21:32:32.58Z" },
{ url = "https://files.pythonhosted.org/packages/97/c9/39d5b874e8b28845e4ec2202b5da735d0199dbe5b8fb85f91398814a9a46/pyyaml-6.0.3-cp313-cp313-win_amd64.whl", hash = "sha256:79005a0d97d5ddabfeeea4cf676af11e647e41d81c9a7722a193022accdb6b7c", size = 154090, upload-time = "2025-09-25T21:32:33.659Z" },
{ url = "https://files.pythonhosted.org/packages/73/e8/2bdf3ca2090f68bb3d75b44da7bbc71843b19c9f2b9cb9b0f4ab7a5a4329/pyyaml-6.0.3-cp313-cp313-win_arm64.whl", hash = "sha256:5498cd1645aa724a7c71c8f378eb29ebe23da2fc0d7a08071d89469bf1d2defb", size = 140246, upload-time = "2025-09-25T21:32:34.663Z" },
{ url = "https://files.pythonhosted.org/packages/9d/8c/f4bd7f6465179953d3ac9bc44ac1a8a3e6122cf8ada906b4f96c60172d43/pyyaml-6.0.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:8d1fab6bb153a416f9aeb4b8763bc0f22a5586065f86f7664fc23339fc1c1fac", size = 181814, upload-time = "2025-09-25T21:32:35.712Z" },
{ url = "https://files.pythonhosted.org/packages/bd/9c/4d95bb87eb2063d20db7b60faa3840c1b18025517ae857371c4dd55a6b3a/pyyaml-6.0.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:34d5fcd24b8445fadc33f9cf348c1047101756fd760b4dacb5c3e99755703310", size = 173809, upload-time = "2025-09-25T21:32:36.789Z" },
{ url = "https://files.pythonhosted.org/packages/92/b5/47e807c2623074914e29dabd16cbbdd4bf5e9b2db9f8090fa64411fc5382/pyyaml-6.0.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:501a031947e3a9025ed4405a168e6ef5ae3126c59f90ce0cd6f2bfc477be31b7", size = 766454, upload-time = "2025-09-25T21:32:37.966Z" },
{ url = "https://files.pythonhosted.org/packages/02/9e/e5e9b168be58564121efb3de6859c452fccde0ab093d8438905899a3a483/pyyaml-6.0.3-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:b3bc83488de33889877a0f2543ade9f70c67d66d9ebb4ac959502e12de895788", size = 836355, upload-time = "2025-09-25T21:32:39.178Z" },
{ url = "https://files.pythonhosted.org/packages/88/f9/16491d7ed2a919954993e48aa941b200f38040928474c9e85ea9e64222c3/pyyaml-6.0.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c458b6d084f9b935061bc36216e8a69a7e293a2f1e68bf956dcd9e6cbcd143f5", size = 794175, upload-time = "2025-09-25T21:32:40.865Z" },
{ url = "https://files.pythonhosted.org/packages/dd/3f/5989debef34dc6397317802b527dbbafb2b4760878a53d4166579111411e/pyyaml-6.0.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:7c6610def4f163542a622a73fb39f534f8c101d690126992300bf3207eab9764", size = 755228, upload-time = "2025-09-25T21:32:42.084Z" },
{ url = "https://files.pythonhosted.org/packages/d7/ce/af88a49043cd2e265be63d083fc75b27b6ed062f5f9fd6cdc223ad62f03e/pyyaml-6.0.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:5190d403f121660ce8d1d2c1bb2ef1bd05b5f68533fc5c2ea899bd15f4399b35", size = 789194, upload-time = "2025-09-25T21:32:43.362Z" },
{ url = "https://files.pythonhosted.org/packages/23/20/bb6982b26a40bb43951265ba29d4c246ef0ff59c9fdcdf0ed04e0687de4d/pyyaml-6.0.3-cp314-cp314-win_amd64.whl", hash = "sha256:4a2e8cebe2ff6ab7d1050ecd59c25d4c8bd7e6f400f5f82b96557ac0abafd0ac", size = 156429, upload-time = "2025-09-25T21:32:57.844Z" },
{ url = "https://files.pythonhosted.org/packages/f4/f4/a4541072bb9422c8a883ab55255f918fa378ecf083f5b85e87fc2b4eda1b/pyyaml-6.0.3-cp314-cp314-win_arm64.whl", hash = "sha256:93dda82c9c22deb0a405ea4dc5f2d0cda384168e466364dec6255b293923b2f3", size = 143912, upload-time = "2025-09-25T21:32:59.247Z" },
{ url = "https://files.pythonhosted.org/packages/7c/f9/07dd09ae774e4616edf6cda684ee78f97777bdd15847253637a6f052a62f/pyyaml-6.0.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:02893d100e99e03eda1c8fd5c441d8c60103fd175728e23e431db1b589cf5ab3", size = 189108, upload-time = "2025-09-25T21:32:44.377Z" },
{ url = "https://files.pythonhosted.org/packages/4e/78/8d08c9fb7ce09ad8c38ad533c1191cf27f7ae1effe5bb9400a46d9437fcf/pyyaml-6.0.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:c1ff362665ae507275af2853520967820d9124984e0f7466736aea23d8611fba", size = 183641, upload-time = "2025-09-25T21:32:45.407Z" },
{ url = "https://files.pythonhosted.org/packages/7b/5b/3babb19104a46945cf816d047db2788bcaf8c94527a805610b0289a01c6b/pyyaml-6.0.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6adc77889b628398debc7b65c073bcb99c4a0237b248cacaf3fe8a557563ef6c", size = 831901, upload-time = "2025-09-25T21:32:48.83Z" },
{ url = "https://files.pythonhosted.org/packages/8b/cc/dff0684d8dc44da4d22a13f35f073d558c268780ce3c6ba1b87055bb0b87/pyyaml-6.0.3-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a80cb027f6b349846a3bf6d73b5e95e782175e52f22108cfa17876aaeff93702", size = 861132, upload-time = "2025-09-25T21:32:50.149Z" },
{ url = "https://files.pythonhosted.org/packages/b1/5e/f77dc6b9036943e285ba76b49e118d9ea929885becb0a29ba8a7c75e29fe/pyyaml-6.0.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:00c4bdeba853cc34e7dd471f16b4114f4162dc03e6b7afcc2128711f0eca823c", size = 839261, upload-time = "2025-09-25T21:32:51.808Z" },
{ url = "https://files.pythonhosted.org/packages/ce/88/a9db1376aa2a228197c58b37302f284b5617f56a5d959fd1763fb1675ce6/pyyaml-6.0.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:66e1674c3ef6f541c35191caae2d429b967b99e02040f5ba928632d9a7f0f065", size = 805272, upload-time = "2025-09-25T21:32:52.941Z" },
{ url = "https://files.pythonhosted.org/packages/da/92/1446574745d74df0c92e6aa4a7b0b3130706a4142b2d1a5869f2eaa423c6/pyyaml-6.0.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:16249ee61e95f858e83976573de0f5b2893b3677ba71c9dd36b9cf8be9ac6d65", size = 829923, upload-time = "2025-09-25T21:32:54.537Z" },
{ url = "https://files.pythonhosted.org/packages/f0/7a/1c7270340330e575b92f397352af856a8c06f230aa3e76f86b39d01b416a/pyyaml-6.0.3-cp314-cp314t-win_amd64.whl", hash = "sha256:4ad1906908f2f5ae4e5a8ddfce73c320c2a1429ec52eafd27138b7f1cbe341c9", size = 174062, upload-time = "2025-09-25T21:32:55.767Z" },
{ url = "https://files.pythonhosted.org/packages/f1/12/de94a39c2ef588c7e6455cfbe7343d3b2dc9d6b6b2f40c4c6565744c873d/pyyaml-6.0.3-cp314-cp314t-win_arm64.whl", hash = "sha256:ebc55a14a21cb14062aa4162f906cd962b28e2e9ea38f9b4391244cd8de4ae0b", size = 149341, upload-time = "2025-09-25T21:32:56.828Z" },
]
[[package]]
name = "readme-renderer"
version = "44.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "docutils" },
{ name = "nh3" },
{ name = "pygments" },
]
sdist = { url = "https://files.pythonhosted.org/packages/5a/a9/104ec9234c8448c4379768221ea6df01260cd6c2ce13182d4eac531c8342/readme_renderer-44.0.tar.gz", hash = "sha256:8712034eabbfa6805cacf1402b4eeb2a73028f72d1166d6f5cb7f9c047c5d1e1", size = 32056, upload-time = "2024-07-08T15:00:57.805Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/e1/67/921ec3024056483db83953ae8e48079ad62b92db7880013ca77632921dd0/readme_renderer-44.0-py3-none-any.whl", hash = "sha256:2fbca89b81a08526aadf1357a8c2ae889ec05fb03f5da67f9769c9a592166151", size = 13310, upload-time = "2024-07-08T15:00:56.577Z" },
]
[[package]]
name = "regex"
version = "2021.4.4"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/38/3f/4c42a98c9ad7d08c16e7d23b2194a0e4f3b2914662da8bc88986e4e6de1f/regex-2021.4.4.tar.gz", hash = "sha256:52ba3d3f9b942c49d7e4bc105bb28551c44065f139a65062ab7912bef10c9afb", size = 693187, upload-time = "2021-04-04T16:50:49.77Z" }
[[package]]
name = "requests"
version = "2.32.3"
version = "2.32.5"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "certifi" },
@@ -1373,9 +1931,30 @@ dependencies = [
{ name = "idna" },
{ name = "urllib3" },
]
sdist = { url = "https://files.pythonhosted.org/packages/63/70/2bf7780ad2d390a8d301ad0b550f1581eadbd9a20f896afe06353c2a2913/requests-2.32.3.tar.gz", hash = "sha256:55365417734eb18255590a9ff9eb97e9e1da868d4ccd6402399eaf68af20a760", size = 131218, upload-time = "2024-05-29T15:37:49.536Z" }
sdist = { url = "https://files.pythonhosted.org/packages/c9/74/b3ff8e6c8446842c3f5c837e9c3dfcfe2018ea6ecef224c710c85ef728f4/requests-2.32.5.tar.gz", hash = "sha256:dbba0bac56e100853db0ea71b82b4dfd5fe2bf6d3754a8893c3af500cec7d7cf", size = 134517, upload-time = "2025-08-18T20:46:02.573Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl", hash = "sha256:70761cfe03c773ceb22aa2f671b4757976145175cdfca038c02654d061d6dcc6", size = 64928, upload-time = "2024-05-29T15:37:47.027Z" },
{ url = "https://files.pythonhosted.org/packages/1e/db/4254e3eabe8020b458f1a747140d32277ec7a271daf1d235b70dc0b4e6e3/requests-2.32.5-py3-none-any.whl", hash = "sha256:2462f94637a34fd532264295e186976db0f5d453d1cdd31473c85a6a161affb6", size = 64738, upload-time = "2025-08-18T20:46:00.542Z" },
]
[[package]]
name = "requests-toolbelt"
version = "1.0.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "requests" },
]
sdist = { url = "https://files.pythonhosted.org/packages/f3/61/d7545dafb7ac2230c70d38d31cbfe4cc64f7144dc41f6e4e4b78ecd9f5bb/requests-toolbelt-1.0.0.tar.gz", hash = "sha256:7681a0a3d047012b5bdc0ee37d7f8f07ebe76ab08caeccfc3921ce23c88d5bc6", size = 206888, upload-time = "2023-05-01T04:11:33.229Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/3f/51/d4db610ef29373b879047326cbf6fa98b6c1969d6f6dc423279de2b1be2c/requests_toolbelt-1.0.0-py2.py3-none-any.whl", hash = "sha256:cccfdd665f0a24fcf4726e690f65639d272bb0637b9b92dfd91a5568ccf6bd06", size = 54481, upload-time = "2023-05-01T04:11:28.427Z" },
]
[[package]]
name = "rfc3986"
version = "2.0.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/85/40/1520d68bfa07ab5a6f065a186815fb6610c86fe957bc065754e47f7b0840/rfc3986-2.0.0.tar.gz", hash = "sha256:97aacf9dbd4bfd829baad6e6309fa6573aaf1be3f6fa735c8ab05e46cecb261c", size = 49026, upload-time = "2022-01-10T00:52:30.832Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/ff/9a/9afaade874b2fa6c752c36f1548f718b5b83af81ed9b76628329dab81c1b/rfc3986-2.0.0-py2.py3-none-any.whl", hash = "sha256:50b1502b60e289cb37883f3dfd34532b8873c7de9f49bb546641ce9cbd256ebd", size = 31326, upload-time = "2022-01-10T00:52:29.594Z" },
]
[[package]]
@@ -1416,6 +1995,19 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/cd/be/f6b790d6ae98f1f32c645f8540d5c96248b72343b0a56fab3a07f2941897/ruff-0.11.8-py3-none-win_arm64.whl", hash = "sha256:304432e4c4a792e3da85b7699feb3426a0908ab98bf29df22a31b0cdd098fac2", size = 10713129, upload-time = "2025-05-01T14:53:22.27Z" },
]
[[package]]
name = "secretstorage"
version = "3.5.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "cryptography" },
{ name = "jeepney" },
]
sdist = { url = "https://files.pythonhosted.org/packages/1c/03/e834bcd866f2f8a49a85eaff47340affa3bfa391ee9912a952a1faa68c7b/secretstorage-3.5.0.tar.gz", hash = "sha256:f04b8e4689cbce351744d5537bf6b1329c6fc68f91fa666f60a380edddcd11be", size = 19884, upload-time = "2025-11-23T19:02:53.191Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/b7/46/f5af3402b579fd5e11573ce652019a67074317e18c1935cc0b4ba9b35552/secretstorage-3.5.0-py3-none-any.whl", hash = "sha256:0ce65888c0725fcb2c5bc0fdb8e5438eece02c523557ea40ce0703c266248137", size = 15554, upload-time = "2025-11-23T19:02:51.545Z" },
]
[[package]]
name = "six"
version = "1.17.0"
@@ -1519,6 +2111,21 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/8b/81/a0685932473a7a626bd4d27c73f0b8593881391b68ac2fe6f1dc69037c4b/textual_image-0.8.2-py3-none-any.whl", hash = "sha256:35ab95076d2edcd9e59d66e1881bf177ab8acd7f131446a129f55cae9c81c447", size = 109372, upload-time = "2025-04-01T19:39:36.93Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "ticktick-py"
|
||||
version = "2.0.1"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "pytz" },
|
||||
{ name = "regex" },
|
||||
{ name = "requests" },
|
||||
{ name = "urllib3" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/65/15/562e6ea29d39ce1cf7943c7819bab4e4a0811bc0bab2c03a8fc2731b2038/ticktick-py-2.0.1.tar.gz", hash = "sha256:4433ac15d1e2540827f225a6db4669b6242db6d3d882636aa0acdf2792985101", size = 45018, upload-time = "2021-06-24T20:24:08.659Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/de/cb/6291e38d14a52c73a4bf62a5cde88855741c1294f4a68cf38b46861d8480/ticktick_py-2.0.1-py3-none-any.whl", hash = "sha256:676c603322010ba9e508eda71698e917a3e2ba472bcfd26be2e5db198455fda5", size = 45675, upload-time = "2021-06-24T20:24:07.355Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "tqdm"
|
||||
version = "4.67.1"
|
||||
@@ -1531,6 +2138,26 @@ wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl", hash = "sha256:26445eca388f82e72884e0d580d5464cd801a3ea01e63e5601bdff9ba6a48de2", size = 78540, upload-time = "2024-11-24T20:12:19.698Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "twine"
|
||||
version = "6.2.0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "id" },
|
||||
{ name = "keyring", marker = "platform_machine != 'ppc64le' and platform_machine != 's390x'" },
|
||||
{ name = "packaging" },
|
||||
{ name = "readme-renderer" },
|
||||
{ name = "requests" },
|
||||
{ name = "requests-toolbelt" },
|
||||
{ name = "rfc3986" },
|
||||
{ name = "rich" },
|
||||
{ name = "urllib3" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/e0/a8/949edebe3a82774c1ec34f637f5dd82d1cf22c25e963b7d63771083bbee5/twine-6.2.0.tar.gz", hash = "sha256:e5ed0d2fd70c9959770dce51c8f39c8945c574e18173a7b81802dab51b4b75cf", size = 172262, upload-time = "2025-09-04T15:43:17.255Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/3a/7a/882d99539b19b1490cac5d77c67338d126e4122c8276bf640e411650c830/twine-6.2.0-py3-none-any.whl", hash = "sha256:418ebf08ccda9a8caaebe414433b0ba5e25eb5e4a927667122fbe8f829f985d8", size = 42727, upload-time = "2025-09-04T15:43:15.994Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "typing-extensions"
|
||||
version = "4.13.2"
|
||||
@@ -1572,11 +2199,25 @@ wheels = [
|
||||
|
||||
[[package]]
|
||||
name = "urllib3"
|
||||
version = "2.4.0"
|
||||
version = "1.26.7"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/8a/78/16493d9c386d8e60e442a35feac5e00f0913c0f4b7c217c11e8ec2ff53e0/urllib3-2.4.0.tar.gz", hash = "sha256:414bc6535b787febd7567804cc015fee39daab8ad86268f1310a9250697de466", size = 390672, upload-time = "2025-04-10T15:23:39.232Z" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/80/be/3ee43b6c5757cabea19e75b8f46eaf05a2f5144107d7db48c7cf3a864f73/urllib3-1.26.7.tar.gz", hash = "sha256:4987c65554f7a2dbf30c18fd48778ef124af6fab771a377103da0585e2336ece", size = 291350, upload-time = "2021-09-22T18:01:18.331Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/6b/11/cc635220681e93a0183390e26485430ca2c7b5f9d33b15c74c2861cb8091/urllib3-2.4.0-py3-none-any.whl", hash = "sha256:4e16665048960a0900c702d4a66415956a584919c03361cac9f1df5c5dd7e813", size = 128680, upload-time = "2025-04-10T15:23:37.377Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/af/f4/524415c0744552cce7d8bf3669af78e8a069514405ea4fcbd0cc44733744/urllib3-1.26.7-py2.py3-none-any.whl", hash = "sha256:c4fdf4019605b6e5423637e01bc9fe4daef873709a7973e195ceba0a62bbc844", size = 138764, upload-time = "2021-09-22T18:01:15.93Z" },
|
||||
]

[[package]]
name = "virtualenv"
version = "20.35.4"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "distlib" },
    { name = "filelock" },
    { name = "platformdirs" },
]
sdist = { url = "https://files.pythonhosted.org/packages/20/28/e6f1a6f655d620846bd9df527390ecc26b3805a0c5989048c210e22c5ca9/virtualenv-20.35.4.tar.gz", hash = "sha256:643d3914d73d3eeb0c552cbb12d7e82adf0e504dbf86a3182f8771a153a1971c", size = 6028799, upload-time = "2025-10-29T06:57:40.511Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/79/0c/c05523fa3181fdf0c9c52a6ba91a23fbf3246cc095f26f6516f9c60e6771/virtualenv-20.35.4-py3-none-any.whl", hash = "sha256:c21c9cede36c9753eeade68ba7d523529f228a403463376cf821eaae2b650f1b", size = 6005095, upload-time = "2025-10-29T06:57:37.598Z" },
]

[[package]]