import json

import requests

API_URL = "https://www.googleapis.com/youtube/v3/commentThreads"


def get_all_comments(video_id, api_key):
    """Fetch all top-level comments (with replies) for a video, following pagination."""
    comments = []
    # "replies" must be requested in part=, otherwise the API never returns
    # a "replies" key and the reply-collection branch below is dead code.
    params = {
        "part": "snippet,replies",
        "videoId": video_id,
        "key": api_key,
        "maxResults": 100,
    }
    while True:
        response = requests.get(API_URL, params=params)
        response.raise_for_status()
        data = response.json()
        for item in data["items"]:
            top = item["snippet"]["topLevelComment"]["snippet"]
            comment = {
                "text": top["textDisplay"],
                "likeCount": top["likeCount"],
                "publishedAt": top["publishedAt"],
                "authorDisplayName": top["authorDisplayName"],
                "replies": [],
            }
            # Note: for threads with many replies this embedded list may be
            # incomplete; the full set requires the comments.list endpoint
            # with parentId.
            for reply in item.get("replies", {}).get("comments", []):
                comment["replies"].append(
                    {
                        "text": reply["snippet"]["textDisplay"],
                        "likeCount": reply["snippet"]["likeCount"],
                        "publishedAt": reply["snippet"]["publishedAt"],
                        "authorDisplayName": reply["snippet"]["authorDisplayName"],
                    }
                )
            comments.append(comment)
        token = data.get("nextPageToken")
        if not token:
            break
        params["pageToken"] = token
    return comments
def save_comments_to_file(comments, filename="comments.json"):
    with open(filename, "w", encoding="utf-8") as f:
        json.dump(comments, f, ensure_ascii=False, indent=4)
    print(f"Comments have been saved to {filename}")
def generate_text_file(comments, filename="comments.txt"):
    with open(filename, "w", encoding="utf-8") as f:
        for comment in comments:
            f.write(f"Comment: {comment['text']}\n")
            f.write(f"Likes: {comment['likeCount']}\n")
            f.write(f"Date: {comment['publishedAt']}\n")
            f.write(f"Author: {comment['authorDisplayName']}\n")
            if comment["replies"]:
                f.write(f"Replies ({len(comment['replies'])}):\n")
                for reply in comment["replies"]:
                    f.write(f"- Reply: {reply['text']}\n")
            f.write("\n---\n\n")
    print(f"Comments have been saved to {filename}")
# Example usage
if __name__ == "__main__":
    # Replace these with your actual values
    VIDEO_ID = "YouTube movie ID"  # video ID, e.g. X-i1oBrh2LI
    API_KEY = "YouTube_API_KEY"  # <-------- put your YouTube API key here

    # Get all comments
    comments = get_all_comments(VIDEO_ID, API_KEY)

    # Save to file
    save_comments_to_file(comments)

    # Generate text file
    generate_text_file(comments)

    # Print summary
    print(f"Total comments fetched: {len(comments)}")
    total_replies = sum(len(comment["replies"]) for comment in comments)
    print(f"Total replies fetched: {total_replies}")
- """
- Example prompt for ChatGPT:
- You are a data analyst specializing in social media content and sentiment analysis. I will provide you with YouTube comments data. Please analyze these comments and provide comprehensive insights in the following areas:
- 1. Sentiment and Emotional Analysis
- Analyze the overall sentiment of comments (positive, negative, neutral)
- Identify emotional patterns and themes
- Point out particularly notable emotional responses
- Highlight any interesting patterns in how viewers express their feelings
- 2. Content and Topics Analysis
- Identify main discussion themes and topics
- Find recurring subjects or patterns in discussions
- Note any interesting debates or disagreements
- Analyze references to external topics, events, or other content
- Identify inside jokes or community-specific references
- 3. Engagement Patterns
- Analyze patterns in likes and replies
- Identify what types of comments generate the most engagement
- Look for conversation chains and discussion patterns
- Note any viral or highly-referenced comments
- Examine user interaction patterns
- 4. Community Behavior
- Identify community-specific language or jargon
- Analyze how users interact with each other
- Note any community roles or recurring user types
- Identify community consensus or disagreements
- Examine response patterns to different topics
- 5. Technical Language Analysis
- Identify commonly used terms and phrases
- Analyze language complexity and style
- Note any multilingual patterns or language mixing
- Identify unique expressions or terminology
- 6. Temporal Analysis
- Note any changes in comment patterns over time
- Identify peak engagement periods
- Analyze how discussions evolved
- Note any event-related commenting patterns
- Additional Instructions:
- Support your analysis with specific examples from the comments
- Provide numerical data when relevant
- Highlight unexpected or particularly interesting findings
- Note any limitations in your analysis
- Suggest potential interpretations of patterns you identify
- If relevant, compare patterns to typical social media behavior
- Please structure your response with clear headings and bullet points for readability. Prioritize insights that would be most valuable for understanding audience engagement and community dynamics.
- For each observation, try to provide:
- The specific finding
- Supporting evidence/examples
- Possible interpretation
- Relevance or implications
- Note: If you notice any other significant patterns or insights not covered by these categories, please include them in your analysis.
- End your analysis with a summary of the most significant findings and any recommendations for future analysis or engagement strategies.
- Answer in Polish language:
- """