APASRI TITATARN : Bots n Plots class
First touch with Python . . . getting to know how to draw basic shapes
Here is my first drawing in class made with code: my sheep robot . . . a funny one.
My aim for this class:
"I would like to make an image bot, and a further step would be to make it move as a short animated GIF"
Homework, 2nd week: the assignment was to make your own moving robot.
This is my CloudBot, a GIF of a cloud with falling snowflakes. I coded it in Processing using the Python language. Here is the code . . .
Cloud Code

<syntaxhighlight lang="python">
# CloudBot -- Processing (Python Mode)
width = 400
posx = [width/4, width/3, 3*width/5, width/2, width/5, 2*width/3, 3.8*width/5, width/2.3]

def setup():
    size(400, 400)
    background(255)
    frameRate(12)

def draw():
    noStroke()
    background(179, 226, 228)
    change = sin(frameCount*0.1)
    drawcloud(change)
    textsnow(posx, change)
    saveFrame("cloudbot##.jpg")

def drawcloud(change):
    print change
    fluffy = change*5
    fill(255)
    # cloud 1 S
    ellipse(1.25*width/5, 0.9*height/1.8, 80+fluffy, 80+fluffy)
    # cloud 2 M
    ellipse(2*width/5, height/1.8-height/10, 130+2*fluffy, 120+1.5*fluffy)
    # cloud 3 L
    ellipse(2.7*width/5, 0.95*height/1.8-height/10, 120+fluffy, 145+fluffy)
    # cloud 4 M
    ellipse(3.5*width/5, height/1.8-height/10, 110+2.5*fluffy, 110+2*fluffy)
    # face
    fill(80)
    ellipse(width/3, height/2, 7, 7)
    ellipse(2*width/3, height/2, 7, 7)

def textsnow(posx, change):
    fill(255)
    textSize(32)
    for i in range(7):
        text("*", posx[i]+change*i, height/1.8+frameCount*(i+1)/5)
</syntaxhighlight>
TwitterBot: MIDTERM ASSIGNMENT
======Kaleidoscope Bot======
STEP 1: Still Kaleido
My midterm project is an image-processing bot which makes a kaleidoscope-style image out of a source image. I want the result to look nice and to clearly show the kaleidoscope style, whatever the source image is.
This is the structure of how I 'kaleidify' the source image.
RESULT PICTURE
As can be seen here, in this step the final result takes only a small part of the source image.
It is quite difficult to trace back what the source image looks like, but this is the preparation for the next step.
Code for the image processing:
<syntaxhighlight lang="python">
from PIL import Image, ImageDraw, ImageFilter
import PIL.ImageOps as im

def makekaleido():
    # put image path
    imgfile = "img.jpg"
    # load image
    source = Image.open(imgfile)
    # adjust the source image a bit for a nicer result
    base = source.rotate(31)
    # create a mask of the same size as the source image
    mask = Image.new('RGBA', base.size, (255, 255, 255, 0))
    x, y = base.size
    # one variable to vary the size of the triangle
    var = y/14
    # define the triangle mask position (triangle with a 20 degree tip)
    (originx, originy) = (x/3, int(0.9*y))
    trih = int(12*var)    # fixed formula for triangle height
    triw = int(4.2*var)   # fixed formula for triangle width
    polygonpos = [(originx, originy), (originx+triw, originy),
                  (originx+triw/2, originy-trih)]
    # print(trih, triw)
    # create the mask
    draw = ImageDraw.Draw(mask, 'RGBA')
    draw.polygon(polygonpos, (0, 0, 0, 255))
    del draw
    mask.save("mask.png")
    # get the alpha band from the template
    tmplt = Image.open('mask.png')
    A = tmplt.split()[3]
    # make one triangular piece on a transparent background
    [R, G, B] = base.split()
    tri = Image.merge('RGBA', (R, G, B, A))
    # crop it to the exact size of the triangle to create the primary pattern
    # box = (left, top, right, bottom)
    box = (originx, (originy-trih), (originx+triw), originy)
    pattern_plain = tri.crop(box)
    pattern_plain.save('pattern_plain.png')
    print('....pattern created....')
    # add style to the pattern
    pattern = pattern_plain
    # pattern = pattern_plain.filter(ImageFilter.EDGE_ENHANCE)
    pattern.save('pattern_tri.png')
    print('....stylized pattern....')
    # make a square canvas for the output (width = double the height of the primary pattern)
    canvas = Image.new('RGBA', (2*trih, 2*trih), (255, 255, 255, 0))
    canvas.save('tmpcanvas.png')
    # put the pattern on the canvas:
    # make sure to put the tip of the triangle at the center of the canvas,
    # because when we rotate, the center of the image is the pivot point.
    # note: the paste command requires the coordinates of the top-left corner,
    # so the point to paste the pattern is . . .
    ccenterx = int(trih - triw/2)
    canvas.paste(pattern, (ccenterx, trih))
    # rotate the pattern around, every 40 degrees
    for i in range(0, 360, 40):
        tmpcanvas = canvas
        tmppat = canvas.rotate(i)
        canvas = Image.alpha_composite(tmpcanvas, tmppat)
    # now we have half of the kaleidoscope
    half = canvas
    # mirror the half and rotate it into the gap to create a simple kaleidoscope effect
    mirror = im.mirror(half)
    half2 = mirror.rotate(20)
    # merge the two halves
    output = Image.alpha_composite(half, half2)
    output.save("final.png")

if __name__ == '__main__':
    makekaleido()
</syntaxhighlight>
STEP 2: Moving Kaleido
The next step is to make the bot give more of the feeling of a kaleidoscope. The nice part of a kaleidoscope is when one rotates it around and sees the abstract patterns moving.
I try to create a similar feeling here with the bot: each still image is quite abstract, but once you watch the whole loop of rotation you get a clue about what the source image is.
In the code, I made the variables 'n' and 'a' adjustable to control the resulting GIF file: n is the number of rotation steps and a is the degrees of rotation per step,
so together they affect how smooth the animated GIF is going to be.
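The exact frame-generation code is not shown on this page, but here is a minimal sketch of the idea, assuming the still kaleidoscope from Step 1 has been saved as final.png and using Pillow's animated-GIF writer (the file names and the way the frames are produced are illustrative, not the bot's actual code):

<syntaxhighlight lang="python">
from PIL import Image

# Sketch only: fake the rotating kaleidoscope by spinning the still image
# from Step 1.  'n' and 'a' play the roles described above.
n = 18            # number of rotation steps = number of frames in the GIF
a = 360 // n      # degrees of rotation per step (n*a = 360 closes the loop)

still = Image.open("final.png").convert("RGBA")
frames = []
for i in range(n):
    frame = still.rotate(i * a)        # rotate the whole pattern a bit more each frame
    frames.append(frame.convert("P"))  # GIF frames need palette mode

# duration = display time per frame in milliseconds, loop=0 = loop forever
frames[0].save("kaleido_preview.gif", save_all=True,
               append_images=frames[1:], duration=80, loop=0)
</syntaxhighlight>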
STEP 3: Making the bot work online
The @kaleidogif Twitter account was created to be the page for the bot. (Before that I just did trial and error with my own account.)
Link: [https://twitter.com/kaleidogif KaleidoGIF twitter]
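The posting step itself lives in the bot's code; as a rough sketch, assuming the bot uses the tweepy library and that the four credential strings are placeholders for the keys Twitter issues for the @kaleidogif app, it could look like this:

<syntaxhighlight lang="python">
import tweepy

# Sketch only: authenticate as the bot account, upload the finished GIF,
# then attach it to a new status update.
auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
api = tweepy.API(auth)

media = api.media_upload("kaleido.gif")
api.update_status(status="kaleidified!", media_ids=[media.media_id])
</syntaxhighlight>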
This is the example result.
FINISHING STEP
Last step . . . for the summary event, I improved the quality of the GIF a bit by editing the code to produce the best quality it can, while keeping the final GIF no larger than the Twitter status update limit.
So I added lines of code to get the size of the final GIF and resize it if it exceeds the limit.
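As a sketch of that size check (the 5 MB value is only an assumption about the GIF limit for a status update at the time, and the scaling factor is arbitrary):

<syntaxhighlight lang="python">
import os
from PIL import Image

MAX_GIF_BYTES = 5 * 1024 * 1024   # assumed Twitter limit for an animated GIF

def shrink_to_limit(path, limit=MAX_GIF_BYTES, scale=0.8):
    """Scale the GIF frames down until the file fits under the limit."""
    while os.path.getsize(path) > limit:
        gif = Image.open(path)
        w, h = gif.size
        frames = []
        try:
            while True:                      # walk through every frame of the GIF
                frames.append(gif.copy().resize((int(w * scale), int(h * scale))))
                gif.seek(gif.tell() + 1)
        except EOFError:
            pass                             # no more frames
        frames[0].save(path, save_all=True, append_images=frames[1:],
                       duration=gif.info.get("duration", 80), loop=0)

shrink_to_limit("kaleido.gif")
</syntaxhighlight>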
The complete code of the final version can be found here in the GitHub link.